Readit News
danudey commented on AI is unmasking ICE officers. Can Washington do anything about it?   politico.com/news/2025/08... · Posted by u/petethomas
like_any_other · 2 days ago
More and more it seems that a country is fundamentally not allowed to say "no" to immigration. Even the ~1 million/year for the last 25 years [1] that the US has admitted legally is deemed too restrictive, so those who try to enforce immigration law are attacked. No position short of "America belongs to everyone" is permitted, no matter what voters say.

I wonder if experts will emerge to call this inciting "stochastic terrorism" [2]. I won't be holding my breath.

[1] https://en.wikipedia.org/wiki/United_States_immigration_stat...

[2] https://en.wikipedia.org/wiki/Stochastic_terrorism

danudey · 2 days ago
Saying "no" to immigration is one thing; masked unidentified thugs surrounding a student with a legal visa in the streets and throwing her into an unmarked van to deport her with no warning isn't anti-immigration, it's a violation of civil rights.

Shipping someone to a concentration camp in El Salvador despite a federal judge's order that they not be sent there, insisting without evidence that they were a member of a gang, saying anyone who wants them to be able to defend themselves in court is pro-gang-violence, and then insisting that there's nothing you can do to get them back so everyone should stop complaining and move on... that's some fascist secret police shit.

danudey commented on AI is unmasking ICE officers. Can Washington do anything about it?   politico.com/news/2025/08... · Posted by u/petethomas
danudey · 2 days ago
Really? Because they were wearing masks from the start, long before the pushback started.

The reality is that they're wearing masks so that they can act with impunity; they behave as though they're above the law because they know that the people they're working for will never try to hold them accountable, and as long as they're anonymous the public can't hold them accountable either. This means they can do whatever they want to whoever they want and nothing will come of it, so if they want to harass and assault minorities they can do so freely.

danudey commented on The web does not need gatekeepers: Cloudflare’s new “signed agents” pitch   positiveblue.substack.com... · Posted by u/positiveblue
ForHackernews · 2 days ago
Rate-limits? Use a CDN? Lots of traffic can be a problem whether it's bots or humans.
danudey · 2 days ago
You realize this entire thread is about a pitch from a CDN company, right? The problem has presented itself at such a scale that this is the best option they can think of to keep the web alive.

"Use a CDN" is not sufficient when these bots are so incredibly poorly behaved, because you're still paying for that CDN and this bad behavior is going to cost you a fortune in CDN costs (or cost the CDN a fortune instead, which is why Cloudflare is suggesting this).

danudey commented on The web does not need gatekeepers: Cloudflare’s new “signed agents” pitch   positiveblue.substack.com... · Posted by u/positiveblue
wvenable · 2 days ago
Is that really the problem we are discussing? I've had people attack my server and bring it down, but that has nothing to do with being free and open to everyone. A top Hacker News post could take my server down.
danudey · 2 days ago
Yes, because when a top Hacker News post takes your server down, it's because a large number of actual humans are getting actual value from your posts. Meanwhile, you stand to benefit from the HN discussion by learning new things and perspectives from the community.

The AI bot assault, on the other hand, is one company (or a few companies) re-fetching the same data over and over again, constantly, in perpetuity, just in case it's changed, all so they can incorporate it into their training set and make money off of it while giving you zero credit and providing zero feedback.
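A well-behaved fetcher would use conditional requests, so an unchanged page costs almost nothing. A minimal sketch (Python with the requests library; the cache dict and bot name are placeholders):

  import requests

  def polite_fetch(url: str, cache: dict) -> bytes:
      # Send the ETag we saw last time; a 304 reply means "unchanged",
      # and no response body is transferred at all.
      headers = {"User-Agent": "example-bot/1.0"}
      etag, body = cache.get(url, (None, None))
      if etag:
          headers["If-None-Match"] = etag
      resp = requests.get(url, headers=headers, timeout=10)
      if resp.status_code == 304:
          return body
      cache[url] = (resp.headers.get("ETag"), resp.content)
      return resp.content

The bots I'm complaining about send none of this and pull the full body every single time.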

danudey commented on The web does not need gatekeepers: Cloudflare’s new “signed agents” pitch   positiveblue.substack.com... · Posted by u/positiveblue
chongli · 2 days ago
They were okay with it when Google was sending them traffic. Now they often don’t. They’ve broken the social contract of the web. So why should the sites whose work is being scraped be expected to continue upholding their end?
danudey · 2 days ago
Not only are they scraping without sending traffic, they're doing so much more aggressively than Google ever did; Google, at least, respected robots.txt and kept to the same user-agent. They didn't want to index something that a server didn't want indexed. AI bots, on the other hand, want to index every possible thing regardless of what anyone else says.
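Honoring robots.txt is a few lines with Python's standard library, which makes ignoring it a choice rather than a technical hurdle (a sketch; the bot name and URLs are placeholders):

  from urllib.robotparser import RobotFileParser

  rp = RobotFileParser()
  rp.set_url("https://example.com/robots.txt")
  rp.read()  # fetch and parse the site's rules

  if rp.can_fetch("example-bot", "https://example.com/some/page"):
      ...  # allowed to crawl this URL
  delay = rp.crawl_delay("example-bot")  # honor Crawl-delay if present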
danudey commented on The web does not need gatekeepers: Cloudflare’s new “signed agents” pitch   positiveblue.substack.com... · Posted by u/positiveblue
wvenable · 2 days ago
You have a problem with badly behaved scrapers, not AI.

I can't disagree with being against badly behaved scrapers. But this is neither a new problem nor an interesting one from the standpoint of making information freely available to everyone, even rhinoceroses, assuming they are well behaved. Blocking bad actors is not the same thing as blocking AI.

danudey · 2 days ago
Badly behaved scrapers are not a new problem, but badly behaved scrapers run by multibillion-dollar companies, which use every possible trick to bypass every block, restriction, or rate limit you put in front of them, are a completely new problem on a scale we've never seen before.
danudey commented on The web does not need gatekeepers: Cloudflare’s new “signed agents” pitch   positiveblue.substack.com... · Posted by u/positiveblue
arjie · 2 days ago
The dream is real, man. If you want open content on the Internet, it's never been a better time. My blog is open to all - machine or man. And it's hosted on my home server next to me. I don't see why anyone would bother trying to distinguish humans from AI. A human hitting your website too much is no different from an AI hitting your website too much.

I have a robots.txt that tries to help bots not get stuck in loops, but if they want to, they're welcome to. Let the web be open. Slurp up my stuff if you want to.

Amazonbot seems to love visiting my site, and it is always welcome.

danudey · 2 days ago
> I don't see why anyone would bother trying to distinguish humans from AI.

Because a hundred thousand people reading a blog post is more beneficial to the world than an AI scraper bot fetching my (unchanged) blog post a hundred thousand times just in case it's changed in the last hour.

If AI bots were well-behaved, maintained a consistent user agent, used consistent IP subnets, and respected robots.txt, I wouldn't have a problem with them. You could manage your content filtering however you want (or not at all) and that would be that. Unfortunately, at the moment, AI bots do everything they can to bypass any restrictions, blocks, or rate limits you put on them; they behave as though they're completely entitled to overload your servers in their quest to train their models so they can make billions of dollars on the new AI craze while giving nothing back to the people whose content they're misappropriating.
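Consistency is what makes verification possible at all. Googlebot, for instance, can be verified with the reverse-then-forward DNS check Google documents; a rough Python sketch (error handling trimmed):

  import socket

  def verify_googlebot(remote_ip: str) -> bool:
      # Reverse-DNS the IP, check the domain, then forward-resolve
      # the hostname to confirm it maps back to the same address.
      try:
          host = socket.gethostbyaddr(remote_ip)[0]
          if not host.endswith((".googlebot.com", ".google.com")):
              return False
          return socket.gethostbyname(host) == remote_ip
      except OSError:
          return False

There's no equivalent check for a bot that rotates identities on purpose.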

danudey commented on The web does not need gatekeepers: Cloudflare’s new “signed agents” pitch   positiveblue.substack.com... · Posted by u/positiveblue
zzo38computer · 2 days ago
Yes, I think that you are right (although rate limiting can sometimes be difficult to get working properly).

Delegation of authorization can be useful for things that require it (as in some of the examples given in the article), but public files should not require authorization or authentication to access them. Even if delegation of authorization is helpful for some uses, Cloudflare (or anyone else, other than whoever is delegating the authorization) does not need to be involved in them.

danudey · 2 days ago
> public files should not require authorization or authentication to access them

Define "public files" in this case?

If I have a server with files, those are my private files. If I choose to make them accessible to the world then that's fine, but they're still private files and no one else has a right to access them except under the conditions that I set.

What Cloudflare is suggesting is that content owners (such as myself, HN, the New York Times, etc.) should be provided with the tools to restrict access to their content if unfettered access to all people is burdensome to them. For example, if AI scraper bots are running up your bandwidth bill or server load, shouldn't you be able to stop them? I would argue yes.

And yet you can't. These AI bots will ignore your robots.txt, they'll change user agents if you start to block their user agents, they'll use different IP subnets if you start to block IP subnets. They behave like extremely bad actors and ignore every single way you can tell them that they're not welcome. They take and take and provide nothing in return, and they'll do so until your website collapses under the weight and your readers or users leave to go somewhere else.
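And the blocking side is trivial when bots identify themselves, which is exactly why they stop doing it. A sketch of the whack-a-mole (Python; the subnets are documentation ranges, and the UA substrings are just examples of self-reported crawler names):

  import ipaddress

  BLOCKED_NETS = [ipaddress.ip_network(n)
                  for n in ("198.51.100.0/24", "203.0.113.0/24")]
  BLOCKED_UAS = ("gptbot", "ccbot")  # self-reported, so trivially spoofed

  def is_blocked(remote_ip: str, user_agent: str) -> bool:
      # Reject requests from denylisted subnets or known crawler UAs.
      addr = ipaddress.ip_address(remote_ip)
      if any(addr in net for net in BLOCKED_NETS):
          return True
      return any(ua in user_agent.lower() for ua in BLOCKED_UAS)

Both checks evaporate the moment the crawler rotates to a fresh subnet or changes its User-Agent string, which is exactly the behavior being described here.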

danudey commented on The web does not need gatekeepers: Cloudflare’s new “signed agents” pitch   positiveblue.substack.com... · Posted by u/positiveblue
ctoth · 2 days ago
We started with "AI crawlers are too aggressive" and you've escalated to volumetric DDoS. These aren't the same problem. OpenAI hitting your API too hard is solved by caching, not by Cloudflare deciding who gets an "agent passport."

"Victim blaming"? Can we please leave these therapy-speak terms back in the 2010s where they belong and out of technical discussions? If expecting basic caching is victim blaming, then so is expecting HTTPS, password hashing, or any technical competence whatsoever.

Your decentralization point actually proves mine: yes, attackers distribute while defenders centralize. That's why we shouldn't make centralization mandatory! Right now you can choose Cloudflare. With attestation, they become the web's border control.

The fine article makes it clear what this is really about - Cloudflare wants to be the gatekeeper for agent traffic. Agent attestation doesn't solve volumetric attacks (those need the DDoS protection they already sell; no new proposal required!). They're creating an allowlist where they decide who's "legitimate."

But sure, let's restructure the entire web's trust model because some sites can't configure a cache. That seems proportional.

danudey · 2 days ago
OpenAI hitting your static, cached pages too hard and costing you terabytes of extra bandwidth that you have to pay for (both in bandwidth itself and in data transfer fees) isn't solved by caching.

The post you're replying to points out that, at a certain scale, even caching things in memory can cause your system to fall over when user agents (e.g. AI scraper bots) behave like bad actors, ignoring robots.txt and fetching every URL twenty times a day while completely ignoring cache headers, Last-Modified, and so on.

Your points were all valid when we were dealing with "legitimate users", "legitimate good-faith bots", and "bad actors", but now the AI companies' need for massive amounts of up-to-the-minute content at all costs means that we have to add "legitimate bad-faith bots" to the mix.

> Agent attestation doesn't solve volumetric attacks (those need the DDoS protection they already sell; no new proposal required!). They're creating an allowlist where they decide who's "legitimate."

Agent attestation solves overzealous AI scraping that looks like a volumetric attack, because if you refuse to provide the content to the bots then the bots will leave you alone (or at least, they won't chew up your bandwidth by re-fetching the same content over and over all day).
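The primitive underneath "signed agents" is just a request signature verified against a key the crawler operator publishes. A bare-bones illustration (Python with the cryptography package, assuming Ed25519 keys; the actual proposal wraps this in HTTP-level request signing that this sketch ignores):

  from cryptography.exceptions import InvalidSignature
  from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey

  def verify_agent(pubkey_bytes: bytes, signature: bytes, payload: bytes) -> bool:
      # Accept the request only if it was signed by the operator's published key.
      key = Ed25519PublicKey.from_public_bytes(pubkey_bytes)
      try:
          key.verify(signature, payload)
          return True
      except InvalidSignature:
          return False

An unsigned or badly signed request can be refused outright, so the bot gets nothing to chew on.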

danudey commented on Framework Laptop 16   frame.work/ro/en/laptop16... · Posted by u/susanthenerd
kibwen · 5 days ago
Hardware-wise, no, I've had plenty of PC trackpads that are better than Apple trackpads. But MacOS tends to have better built-in support for advanced gestures, which seem to be impossible on Windows and must be manually configured on Linux (but gives you enormous power once you do).
danudey · 5 days ago
Apple's palm rejection is also top tier, though other systems have been getting better. My current Dell seems fine so far, but at my last company the Dell I had was almost unusable because the cursor would teleport around my document randomly whenever my hands got too close to the trackpad (which is where they have to be to type).

Not sure if it's a hardware (Dell) or software (Ubuntu) improvement, but thank god.

u/danudey

Karma: 8358 · Cake day: July 8, 2009
About
You can find me as <my HN username> on twitter, skype, instagram, @me.com, and so on.