I'm sympathetic to arguments against a lot of the surveillance we see in schools, but it does the anti-surveillance position a great disservice to use an example where the system correctly identified genuine misconduct.
The girl in the article says this sort of thing doesn't work, yet in the next sentence she admits that it did: "I’m never going to do something like that again, because the repercussions I faced were horrible." The system correctly identified her, and as a result she decided not to behave that way again. This reads like a young kid who doesn't think her actions should have consequences.
The old version of "actions not having consequences" was not getting caught. It can be enriching to rebel a little and get away with stuff. A lot of people grow into perfectly healthy adults and look back fondly on the stunts they pulled. Personally, I don't want my kids using any drugs, but I don't think the answer is to put cameras in their rooms or supervise every play date until they're 26.
Perhaps the sensible solution is to make sure there is no system in which children face legal charges for drug use. What an incredibly stupid system. They're kids. They are supposed to be making mistakes and learning from them in low-impact environments until they're old enough to be more responsible for their actions.
I am not surprised that kids have moved their social lives online.
We are the ones building this technology; suggesting we have no agency is helpful only for avoiding any feeling of responsibility or guilt, which perhaps renders your comment within the realm of waxing philosophical.
Who better to worry about this than the people of Hacker News?
From a pure mental health standpoint, sure, it’s solid advice, but I think it narrows the context of the broader concern too much.
An alternative to the learned helplessness of “nothing you can do” is to encourage technologists to do the opposite.
Instead of forgetting about it or trying to put it out of your mind, fight for the future you want. Join others in that effort. That’s the reason society has hope: not the people shrugging as others fall by the wayside.
Mitigating depression through agency feels more positive, but I don’t have a lot of experience, tbh. Just a view that we technologists shouldn’t abdicate responsibility, nor encourage others to do so.
That culture, imo, is why a large section of tech workers, consumers and commentators see the industry in a bad light. They’re not wrong.
EDIT: to add, “what problems can I personally solve” also individualises society’s ability to shape itself for the better. “What problems can I personally get involved in solving” or “what communities are trying to solve problems I care about” is perhaps the message I’d advocate for.
The cat's out of the bag. There is no legislation that will stop this, not unless/until it has some obscene cost and AI gets locked down like nuclear weapons. But even then, it's just too simple to make these things now that the tech is known.
I sure don't know the answer, but we just don't know what's coming next. We're gonna have to wait and see.