There is a current "show your personal site" post on top of HN [1] with 1500+ comments. I wonder how many of those sites are or will be hammered by AI bots in the next few days to steal/scrape content.
If this can be used as a temporary guard against AI bots, that post would be a good opportunity to test it out.
AI bots (or clients claiming to be AI bots) show up on new sites quite fast; at least that's what I've seen recently in a few places. They probably monitor Certificate Transparency logs, so you won't stay hidden just by not linking to the site. Unless you're OK with staying in the shadow of plain HTTP.
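For anyone curious, the CT-log angle is easy to verify yourself. Here's a minimal sketch that queries crt.sh (a public CT-log search frontend with a JSON output mode) for the hostnames that certificates have exposed for a domain; the helper names are my own:

```python
import json
from urllib.request import urlopen
from urllib.parse import quote


def ct_entries(domain):
    """Ask crt.sh for CT-log entries covering a domain and its subdomains."""
    url = "https://crt.sh/?q=" + quote("%." + domain) + "&output=json"
    with urlopen(url, timeout=10) as resp:
        return json.load(resp)


def hostnames(entries):
    """Collect the distinct hostnames seen across CT entries.
    crt.sh puts one or more names per entry in 'name_value',
    newline-separated."""
    names = set()
    for entry in entries:
        names.update(entry.get("name_value", "").splitlines())
    return sorted(names)
```

Any freshly issued certificate for a new subdomain shows up in searches like this within hours, which is presumably how bots find sites that were never linked anywhere.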
My site is hosted on Cloudflare and I trust its protection way more than a flavor-of-the-month method. This probably won't be patched anytime soon, but I'd rather have some people click my link than avoid it (along with the AI) because it looks fishy :)
Glad I’m not the only one who felt icky seeing that post.
I agree. My tinfoil-hat signal told me this was the perfect way to ask people for bespoke, hand-crafted content, which of course AI will love to slurp up to keep feeding the bear.
How is AI viewing content any different from Google? I don’t even use Google anymore because it’s so filled with SEO trash as to be useless for many things.
LLM-led scraping might not follow such links, as it requires an LLM to make a choice to kick it off, but crawling for the purpose of training data is unlikely to be affected.
Sounds like a useful signal for people building custom agents or models. Being able to control whether automated systems follow a link via metadata is an interesting lever, especially given how inconsistent current model heuristics are.
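If you're building such an agent, the guard can be as simple as a URL-pattern check before fetching. A hypothetical sketch (the pattern list is my own invention, tuned to the kind of links this shortener generates):

```python
import re
from urllib.parse import urlparse

# Hypothetical agent-side guard: refuse to fetch links whose paths match
# patterns that deliberately shady shorteners like this one generate.
SKETCHY = re.compile(r"(xss|logger|\.docm$|\.exe$|\.scr$)", re.IGNORECASE)


def should_follow(url):
    """Return False for URLs an automated agent should decline to fetch."""
    parsed = urlparse(url)
    return not SKETCHY.search(parsed.path)
```

It's crude, but it makes the decision deterministic instead of leaving it to whatever heuristic the model happens to apply that day.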
I got one where the called script ended in ".pl" and I had a flashback to the 90s. My trousers grew into JNCOs, Limp Bizkit started playing out of nowhere and I got a massive urge to tell Slashdot that Alan Thicke had died.
I was able to get past that (Firefox on the Desktop) by clicking the "see details" button and then clicking the "ignore the risk" link. It took me a while to actually read the text too.
I shortened a link and when trying to access it in Chrome I get a red screen with this message:
Dangerous site
Attackers on the site you tried visiting might trick you into installing software or revealing things like your passwords, phone, or credit card numbers. Chrome strongly recommends going back to safety.
I think it’s perfectly reasonable to make something useless for fun, it’s an interesting idea.
But what I’d like to understand is why there are so many of the same thing. I know I’ve seen this exact idea on HN multiple times. It’s funny the first time, but once the novelty is gone (which is almost immediately), what’s the point of another and another and another?
I think it's just someone learning something new most of the time.
I have home-made URL shorteners in Go, Rust, Java, Python, PHP, Elixir, TypeScript, etc. Why? Because I'm trying out the language, and this kind of project touches on many things: web, databases, custom logic, and which design patterns I can apply, using as much of the language as I can to build the thing.
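For anyone who hasn't built one: the core of every one of those implementations is the same few lines. A minimal in-memory sketch (base62 over an auto-increment id; a real version swaps the dict for a database table):

```python
import string

# Base62 alphabet: 0-9, a-z, A-Z.
ALPHABET = string.digits + string.ascii_lowercase + string.ascii_uppercase


def encode(n):
    """Turn an auto-increment row id into a short base62 code."""
    if n == 0:
        return ALPHABET[0]
    out = []
    while n:
        n, r = divmod(n, 62)
        out.append(ALPHABET[r])
    return "".join(reversed(out))


def decode(code):
    """Invert encode(): map a short code back to the row id."""
    n = 0
    for ch in code:
        n = n * 62 + ALPHABET.index(ch)
    return n


# In-memory stand-in for the id -> long URL table.
db = {}


def shorten(url):
    db[len(db) + 1] = url
    return encode(len(db))


def resolve(code):
    return db[decode(code)]
```

Everything beyond this (collision handling, custom aliases, analytics, expiry) is where the system-design discussion actually starts.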
Right. But the question is why redo the exact same joke? Why not come up with another twist (like the URL lengthener) or do no twist but be useful?
I’m not criticising the author or anyone who came before. I’m trying to understand the impetus behind redoing a joke that isn’t yours. You don’t learn anything new by redoing the exact same gag that you wouldn’t learn by being even slightly original or making the project truly useful.
Ideas are a dime a dozen. You could make e.g. a Fonzie URL shortener (different lengths of “ayyyyy”), or an interstellar one (each is the name of a space object), or a binary one (all ones and zeroes)… Each of those would take about the same effort and teach you the same, but they’re also different enough they would make some people remember them, maybe even look at the author and their other projects, instead of just “oh, another one of these, close”.
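To be fair to the point above, the twists really are cheap. Two of the suggested variants sketched out (names and formats are my own invention):

```python
def binary_code(n, width=16):
    """Encode a row id as ones and zeroes, so links look like
    .../0000000000000101 instead of an ordinary short code."""
    return format(n, "0{}b".format(width))


def fonzie_code(n):
    """'a' followed by n+1 'y's; the length alone identifies the id,
    so it decodes unambiguously (if absurdly)."""
    return "a" + "y" * (n + 1)
```

Each is a one-line change to the encoding step of an ordinary shortener, which is the argument: the effort is identical, only the joke differs.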
URL Shortener is still one of the most popular System Design questions, building this project is a great way to have some experience / understanding of it, for example.
I agree. But a URL shortener with a twist isn’t just fun, it’s funny. The joke—as opposed to the usefulness—is what’s interesting about it. But when the same joke is overdone, it’s no longer funny.
> building this project is a great way to have some experience / understanding of it
I actually forgot that this had been done before until you mentioned it.
Giving the author the benefit of the doubt, they may not have seen it before, or were bored and just wanted to make a toy.
And it seems like many on HN are in a similar enough boat to me to have upvoted it to trending, so at least some people found it entertaining; it fulfilled its purpose, I suppose.
It's a good question though, and I don't think anyone really knows the answer.
The author posted this project on reddit a few days ago where they mentioned their motivation: "I have a coworker who is constantly talking about the glory days of ShadyUrl, but that website has been down for several years at this point, so I figured I would create an alternative."
One reason is that not all these websites manage to make equally "creepy" links, even though the basic idea is the same. I remember one version which was a lot more alarming than the current example, with links containing a mix of suspicious content hinting at viruses, phishing, piracy/warez sites, pornography (XXX cams), and Bitcoin scams. I don't remember that website, but the current case seems rather weak by comparison.
That makes it even more confusing. If you’re making something creepy, I can see the argument for “whatever exists isn’t creepy enough, I’ll do it better” but not the reverse.
What's up with the creepy ads on this website? It seems like they are actually sketchy ads and not just fake ads for comedic effect. One shows some scammy nonsense about your device being infected and the other links to a real VPN app.
Please don’t use 3rd-party relays for your URLs. It’s bad enough to have your own server, domain, etc. as single points of failure and bottlenecks without adding a 3rd party into the mix who, either themselves or whoever takes over their domain later, may track users, randomly redirect your users to a malicious site, or just fail.
I know people have fond memories of long ago, when they thought surely some big company’s URL shortener would never be taken down, and learned better when it later was.
edit: gpt-oss 20B & 120B both eagerly visit it.
1. https://news.ycombinator.com/item?id=46618714
https://jpmorgan.c1ic.link/logger_zcGFC2_bank_xss.docm
Definitely not meta
Deceptive site issue
This web page at [...] has been reported as a deceptive site and has been blocked based on your security preferences.
What's going on? I can't find any setting to disable this.
Edit: I see references to shadyurl in the comments and I have heard of that, but probably wouldn’t have thought of it.
https://xkcd.com/1053/
Again, this was not a criticism, but a genuine question.
https://news.ycombinator.com/item?id=46632329
For example, the official healthcare.gov emails: even for links to that domain, they would still run them through lnks.gd, even though:
1) The emails would be very long and flashy, so they're clearly not economizing on space.
2) The "shortened" URL was usually longer!
3) That domain doesn't let you go straight to the root and check where the transformed URL is going.
It's training users to do the very things that expose them to scammers!