Because no one believes these laws or bills or acts or whatever will be enforced.
But I actually believe they'll be. In the worst way possible: honest players will be punished disproportionately.
Time will tell. Texas sat on its biometric data act quite quietly, then hammered Meta with a $1.4B settlement 15 years after the bill's enactment. Once these laws are enacted, they lie quietly until someone has a big enough bone to pick with someone else. There are already many traumatic events occurring downstream of slapdash AI development.
It's kinda funny, the oft-held animosity towards the EU's heavy-handed regulations, when navigating US state law is a complete minefield of its own.
Sometimes folks need to be stopped. Sometimes those walls are there for a reason.
And after IPO, maybe a founder should consider the good of the world instead of what they think they want next for themselves, and stop bashing down more walls.
But I dunno… I’m just a rando.
The cost of indexing using third-party APIs is extremely high, however. This might work out well with an open-source model and a cluster of Raspberry Pis for large library indexing?
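For what it's worth, here's a minimal sketch of what the local approach could look like, assuming a small open-source embedding model served via sentence-transformers (the model choice, paths, and batch size below are just placeholders, not a recommendation):

```python
# Rough sketch: index a document library locally with an open-source
# embedding model instead of a paid third-party API.
# Assumes `pip install sentence-transformers numpy`.
from pathlib import Path

import numpy as np
from sentence_transformers import SentenceTransformer

# Small model (~80 MB) that runs on CPU, so a Raspberry Pi-class node
# can plausibly chew through a shard of the library.
model = SentenceTransformer("all-MiniLM-L6-v2")

library = Path("library")  # hypothetical directory of plain-text docs
docs = sorted(library.glob("*.txt"))
texts = [d.read_text(encoding="utf-8") for d in docs]

# Batch-encode; on a cluster you'd split `docs` across nodes
# and merge the resulting arrays afterwards.
embeddings = model.encode(texts, batch_size=16, convert_to_numpy=True)

np.save("embeddings.npy", embeddings)
Path("doc_ids.txt").write_text("\n".join(d.name for d in docs))
```

The trade-off is wall-clock time rather than per-token fees: a Pi cluster is slow per node, but embedding a static library is embarrassingly parallel, so throughput scales with node count.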
All-lowercase comes across as the text equivalent of a hoodie and jeans: comfortable, a bit defensive against being seen as trying too hard, and now so common it barely reads as rebellion.
There is no way for the AI system to verify whether you are a white hat or a black hat when you are pen-testing, if the only task is to pen-test. Since this is not part of a "broader attack" (in that context), there is no "threat".
I don't see how this can be avoided, given that there are legitimate uses for every step of this in building defenses against novel attacks.
Yes, all of this can be done with code and humans as well - but it is the scale and the speed that become problematic. It can adjust in real time to individual targets and does not need as much human intervention or tailoring.
Is this obvious? Yes - but it seems they are trying to raise awareness of an actual use of this in the wild and get people discussing it.