You’re going to see more of this heavy-handed response, especially from smaller sites or decentralized services.
As I’ve argued on past threads about these laws: the internet was neither built nor intended for children. Nobody can get online without some adult intervention (paying for an ISP), and that’s the only age check that’s ever needed.
For everything else, it’s up to parents or guardians to implement filters, content controls, and blocks.
There are significant factions who would prefer porn be eradicated in its entirety, and laws like this just use 'protecting children' as the more agreeable face of their crusade. Ironically, they're often the same people who crow about parental autonomy and how they should be in complete control of their children's education and lives.
For all the talk about free speech and freedoms, a significant portion of the US doesn’t actually want free speech. They want free speech only for things they agree with.
This isn't even really it. If you read the section of Project 2025 about porn and these sorts of age laws, they barely talk about porn at all. They lead with "transgender ideology" and such. The goal isn't to keep porn away from kids. The goal is to keep anything that offends their desired hierarchy away from kids.
For my silly little semi-private sites I will likely shut off the clear-web daemons and stick with .onion hidden services. Some users will leave, and that is fine with me; it's just hobby stuff for me. I will still use RTA headers [1] in the event that someday lawmakers come to their senses. Curious what others here will do with their forums, chat servers, etc...

[1] - https://www.rtalabel.org/index.php?content=howtofaq#single
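For anyone who hasn't used RTA labeling: it's just a fixed string the site serves so that client-side parental filters can recognize and block it. A minimal sketch of serving it as a response header, with the caveat that the "Rating" header name is my reading of the FAQ linked above; check it for the exact meta-tag/header forms it expects:

    # Minimal sketch: attach the RTA label to every response so client-side
    # parental filters can recognize and block the site. Double-check the
    # rtalabel.org FAQ for the exact header/meta-tag forms it expects.
    from http.server import HTTPServer, SimpleHTTPRequestHandler

    RTA_LABEL = "RTA-5042-1996-1400-1577-RTA"

    class RTAHandler(SimpleHTTPRequestHandler):
        def end_headers(self):
            self.send_header("Rating", RTA_LABEL)  # assumed header name, per the FAQ
            super().end_headers()

    if __name__ == "__main__":
        HTTPServer(("127.0.0.1", 8080), RTAHandler).serve_forever()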
I know people whose kid got a hand-me-down Android from a friend and connects through neighbors' open WiFi, public open WiFi, etc…
And from what I’ve heard it’s not that uncommon for kids to do something similar when parents take away their phones.
It’s easy to say that parents should just limit access and I think they should. I definitely plan to when my kids are old enough for this to be a problem.
But kids are under extreme peer pressure to be constantly online, and when a kid is willing to go to extreme lengths to get access, it can be nearly impossible to prevent it.
There’s also more to it than what parents should do. It’s about what parents are doing. If something is very hard to do most people won’t do it. As a society we all have to deal with the consequences of bad parenting.
We don’t know the consequences of kids having access to porn, but we have correlational studies suggesting they probably aren’t good.
I’m more concerned with social media than porn though. The correlation between social media use and the rise in teen suicide rates looks awfully suggestive.
Here's the thing: kids are always going to be under peer pressure, and time and time again we keep falling into the trap of harming adults under the guise of protecting kids.
When it was the drug scare of the '80s, entire bodies of research on the harms of DARE's educational methods were ignored in favor of turning a generation of children into police informants on their parents. When it was HIV and STDs in the '90s, we harmed kids by pushing "abstinence-only" narratives that all but ensured more of them would contract STDs and HIV as adults due to a lack of suitable education (never mind that children are often vehicles for new information back into the household, and could have educated their own parents about the new dangers of STDs if they'd been properly taught). In the 2000s, it was attempts to regulate violent video games instead of literal firearms, which has directly contributed to the mass shooting epidemic in the USA. And now we're turning back to porn again, with the same flawed reasoning.
It's almost like the entire point is to harm adults, not protect children.
> I’m more concerned with social media than porn though. The correlation between social media use and the rise in teen suicide rates looks awfully suggestive.
This problem isn't specific to children. Addictive, and often otherwise manipulative, feeds affect people of all ages. Instead of age checks, I'd much rather address this. A starting point could be banning algorithmic feeds and going back to something simple, like independent forum websites' latest-post-first display order.
> For everything else, it’s up to parents or guardians to implement filters, content controls, and blocks.
First off, I'd like to be clear: I don't think laws like this are the right way to go.
But to be fair, even if you are tech literate, which most parents aren't, this is actually pretty difficult to do.
And there are really three approaches you can take. You can use an allowlist of sites, but that is very restrictive and limits the ability to explore, research, and learn how to use the internet generally. You can use a blocklist, but then you will always miss something, and it becomes a game of whack-a-mole. Or you can use some kind of AI, but that will probably both block things you don't want blocked and allow things you do want blocked, and will probably add significant latency.
One way this could be improved is if websites with adult or mature content, or with potential dangers to children (such as allowing the child to communicate with strangers, or gambling), returned a header that marked the content as possibly not suitable for children, with a tag for the reason and maybe a minimum age. Then a browser or firewall could be configured to block access to anything with headers for undesired content. Though I think that would only be effective if there were laws requiring the headers to be honest.
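To make the idea concrete, here's a rough sketch of what a home firewall or proxy might do with such a header. The "Content-Rating" name and its "reason=...; min-age=..." format are made up purely for illustration; nothing like this is standardized today.

    # Hypothetical example of the proposed labeling scheme. The "Content-Rating"
    # header name and its "reason=...; min-age=..." format are invented for
    # illustration; no such standard exists.
    CHILD_AGE = 12
    BLOCKED_REASONS = {"adult", "gambling", "stranger-contact"}

    def is_allowed(response_headers: dict[str, str]) -> bool:
        rating = response_headers.get("Content-Rating")
        if rating is None:
            # Unlabeled content passes, which is exactly why the labels would
            # need to be legally required to be honest.
            return True
        fields = {}
        for part in rating.split(";"):
            if "=" in part:
                key, value = part.strip().split("=", 1)
                fields[key] = value
        if int(fields.get("min-age", "0")) > CHILD_AGE:
            return False
        return fields.get("reason") not in BLOCKED_REASONS

    # is_allowed({"Content-Rating": "reason=gambling; min-age=18"}) -> False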
I mostly agree with you, except there are plenty of ways for non-adults to get internet access without adult intervention (libraries, friends, McDonald’s hotspots).
An adult still has to pay for that internet service, and at that point it's up to the adult in charge to implement sensible filters or protections. Libraries do it, schools do it, and I'm increasingly seeing it on flights and hotspots.
Now of course, a smart kid can bypass those filters (I did just that in HS), but kids will always find a way around whatever filter or guardrail you throw up if they really, really want something - just like how they'd pay a homeless person to buy them booze or R-rated movie tickets or porno mags back in the day, or use fake IDs to get into bars and clubs.
But 99% of kids will be deterred simply by the existence of it. And that's enough.
> As I’ve argued on past threads about these laws: the internet was neither built nor intended for children. Nobody can get online without some adult intervention (paying for an ISP), and that’s the only age check that’s ever needed.
> For everything else, it’s up to parents or guardians to implement filters, content controls, and blocks.
Well, they are implementing the block through political pressure, and it's working
> That’s why until legal challenges to this law are resolved, we’ve made the difficult decision to block access from Mississippi IP addresses. We know this is disappointing for our users in Mississippi, but we believe this is a necessary measure while the courts review the legal arguments.
I strongly agree with this. All these jurisdictions and politicians are passing laws whose technical foundations they don't understand. Second-order effects aren't being considered.
Sometimes (only sometimes, I promise) I wonder whether this kind of legislation is being dreamt up by a think tank tasked with planning how to implement some ulterior goal (e.g. massively increased surveillance to fight crime - it's far too easy to insert something more nefarious here). The politicians then just follow the action plan and repeat talking points from party advisors.
Have you seen Congress? It’s like Denny’s on senior appreciation day.
They had to wheel McConnell in not long ago because he physically couldn’t walk.
And like I don’t mean to shit on the elderly (directly, anyway), but I dunno, just spitballing here: maybe we could get some folks in there who weren’t even born yet when the Civil Rights Act was passed???
> We think this law creates challenges that go beyond its child safety goals, and creates significant barriers that limit free speech and disproportionately harm smaller platforms and emerging technologies.
This is the only correct response to such onerous legislation. Every site affected by such over-reach has a moral duty to do the same. Not that I expect them to do so.
If you think this is bad, you should see the regulatory burden imposed on small manufacturers. This is nothing. The problem is that voters don’t seem to care about regulatory requirements.
The alternative solution is to shutter all US operations and move to another jurisdiction that doesn't impose these regulations, the same way 4chan is ignoring the UK's request.
Harder to implement than an IP ban for a state, though.
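The IP ban itself is conceptually simple, for what it's worth. A rough sketch, assuming the third-party geoip2 package and a downloaded GeoLite2-City.mmdb database; state-level GeoIP is approximate and a VPN walks right around it, which is part of why it's such a blunt instrument:

    # Sketch of a state-level geoblock, assuming the third-party `geoip2`
    # package and a local GeoLite2-City.mmdb database. GeoIP is only
    # approximate at the state level, and VPNs bypass it entirely.
    import geoip2.database
    import geoip2.errors

    reader = geoip2.database.Reader("GeoLite2-City.mmdb")

    def blocked(ip: str) -> bool:
        try:
            rec = reader.city(ip)
        except geoip2.errors.AddressNotFoundError:
            return False  # unknown addresses pass through
        sub = rec.subdivisions.most_specific
        return rec.country.iso_code == "US" and sub.iso_code == "MS"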
I can't find the comic I saw, but it notes how we tell people and kids not to give out personal information on the internet because that's unsafe.
Now we demand they give all their information and, depending on the situation, smile for the camera...
...And also let's make it so they can't encrypt their messages either. Big Brother needs to make sure they aren't sending nudes to people that shouldn't be seeing them.
Wait! Wait! Is this the same state that wanted welfare recipients to be tested for drugs, and then it was found that drug use among legislators was ten times higher?
They're right to point out that laws like this are primarily motivated by government control of speech. From a recent Times article about the UK's Online Safety Act:
> Luckily, we don’t have to imagine the scene because the High Court judgment details the last government’s reaction when it discovered this potentially rather large flaw. First, we are told, the relevant secretary of state (Michelle Donelan) expressed “concern” that the legislation might whack sites such as Amazon instead of Pornhub. In response, officials explained that the regulation in question was “not primarily aimed at … the protection of children”, but was about regulating “services that have a significant influence over public discourse”, a phrase that rather gives away the political thinking behind the act. They suggested asking Ofcom to think again and the minister agreed.
https://www.thetimes.com/comment/columnists/article/online-s...
>> "Mississippi’s new law and the UK’s Online Safety Act (OSA) are very different. Bluesky follows the OSA in the UK. There, Bluesky is still accessible for everyone, age checks are required only for accessing certain content and features, and Bluesky does not know and does not track which UK users are under 18. Mississippi’s law, by contrast, would block everyone from accessing the site—teens and adults—unless they hand over sensitive information, and once they do, the law in Mississippi requires Bluesky to keep track of which users are children."
All arguments about age checks themselves aside, why can BlueSky implement age checks in the UK, but not Mississippi? Seems to me like the only difference would be Mississippi requiring everyone to log in, whereas currently I assume UK requires a login just for age-restricted material. (Although I don't use BlueSky in the UK, so shrugs)
They could do it based on geolocation, but you can get around that. Maybe they are concerned that a user connecting through a VPN from Mississippi would cause them to break the law.
> As I’ve argued on past threads about these laws: the internet was neither built nor intended for children. Nobody can get online without some adult intervention (paying for an ISP), and that’s the only age check that’s ever needed.
> For everything else, it’s up to parents or guardians to implement filters, content controls, and blocks.
Ah yes, those monsters
If the school curriculum aligned with their belief system, they wouldn't be talking about a need for control.
And there is nothing on Bluesky that is not appropriate for children over 13, with parental guidance.
They do need to keep the morons and knuckle-dragging lawyers off the platform, simply because of their felonious actions and prison records.
> Addictive, and often otherwise manipulative, feeds affect people of all ages. Instead of age checks, I'd much rather address this.
Then why isn't that significantly regulated?
> But to be fair, even if you are tech literate, which most parents aren't, this is actually pretty difficult to do.
18 years ago was 2007! If "most parents" of underage children don't understand the internet, where the hell have they been?
I think what Bluesky did is the only way to fight these laws, which will accomplish nothing except being a boon to people who obtain and sell PI.
For people in Mississippi, you can always get a VPN. You should avoid Free VPNs, but that is your decision.
>> "Mississippi’s new law and the UK’s Online Safety Act (OSA) are very different. Bluesky follows the OSA in the UK. There, Bluesky is still accessible for everyone, age checks are required only for accessing certain content and features, and Bluesky does not know and does not track which UK users are under 18. Mississippi’s law, by contrast, would block everyone from accessing the site—teens and adults—unless they hand over sensitive information, and once they do, the law in Mississippi requires Bluesky to keep track of which users are children."
Mississippi: they track "underage" and "adult".
UK: they track "unknown [treated as underage]" and "adult".