Go to Twitter and click on a link to any URL on "NYTimes.com" or "threads.net" and you'll see a roughly five-second delay before t.co forwards you to the right address.
Twitter won't ban domains they don't like but will waste your time if you visit them.
I've been tracking the NYT delay ever since it was added (8/4, roughly noon Pacific time), and the delay is so consistent it's obviously deliberate.
I really wish the term hadn't been polluted this way.
> On Tuesday afternoon, hours after this story was first published, X began reversing the throttling on some of the sites, dropping the delay times back to zero. It was unknown if all the throttled websites had normal service restored.
https://archive.is/2023.08.15-210250/https://www.washingtonp...
> Yuri Orlov: [Narrating] Every faction in Africa calls themselves by these noble names - Liberation this, Patriotic that, Democratic Republic of something-or-other... I guess they can't own up to what they usually are: the Federation of Worse Oppressors Than the Last Bunch of Oppressors. Often, the most barbaric atrocities occur when both combatants proclaim themselves Freedom Fighters.
But at least I can hold them responsible for violating their own stated values. The former Twitter leadership just hid content that didn't fit their own or third parties' sensitivities and told me they were doing me a favor.
Restricting speech is always in the interests of those who have the power to shape discussions, so limiting speech is always counterproductive.
The former Twitter leadership was very clear about what sort of content would be hidden. And it was based entirely on the type of content, known ahead of time. Critiquing this sort of content policy is like saying that newspapers should not be allowed to have clear standards for what is publishable in classified ads.
All claims of "I'm being oppressed" by Twitter policies have been absolutely ridiculous, and discrediting to supposed free speech advocate/absolutist positions.
Similarly discrediting is the silence on Musk's attacks on the free web and attempts at censorship of specific dispreferred news outlets.
We all see what gets fought against and what is not fought against, and the answer is clear: the right to attack and intimidate groups with threatening behavior is defended, but actual censorship of reasonable discourse is tolerated.
This is not true. Restricting hate speech is an obvious counterexample.
Those two are enormously different, though. I'd consider myself an advocate, just as anyone who believes in a fair and free democracy should. But I am very far from being an absolutist — and I have a secret suspicion that nobody actually is. Musk certainly isn't.
It probably goes without saying that this would be an extremely unpleasant place, but there would be nowhere else to go once the last platform won.
What we have today is a number of smaller social networks, each with a different strategy to shape the conversation. It may very well be true that the creators of a platform choose editorial methods and goals that resonate with them personally, but what’s important to the dynamics of the platforms and free speech is that, until we are all on that one terrible platform, the methods used to moderate your speech are nothing more than a company’s effort to differentiate its product from others.
Restricting speech is in the interest of product differentiation. This, of course, is in the interest of the owner of the product, but it is also always in the interest of the consumer, who wants a rich speech market to choose from and who loathes the idea of a global 4chan-style megasite to the exclusion of all other social media. This is why failure to limit speech in the context of a coherent speech product is always counterproductive.
You don't see this with curl/wget because they use user-agent sniffing. If they don't think you're a browser, they _will_ give you a Location header. To see it, capture a request in Firefox developer tools, right-click on the request, and choose Copy as cURL. (You may need to remove the Accept-Encoding header and add `-i` to see the response headers.)
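For example, something like this should show the two behaviors side by side (a sketch using the t.co link tested further down in this thread; the exact responses are whatever Twitter serves at the time):

```
# Default curl User-Agent: expect an HTTP redirect with a Location header.
curl -si https://t.co/4fs609qwWt | head -n 8

# Browser User-Agent: expect a 200 with no Location header; the
# redirect happens client-side in the returned HTML instead.
curl -si -A "Mozilla/5.0 (X11; Linux x86_64; rv:60.0) Gecko/20100101 Firefox/81.0" \
  https://t.co/4fs609qwWt | head -n 8
```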
If the redirects were server-side (setting the Location header), a blank referrer would remain blank. Client-side redirects will set the referrer value.
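One way to check which style you're getting, assuming the browser-facing page embeds the hop in its HTML (the grep patterns are guesses at common client-side redirect markup, not confirmed t.co internals):

```
# Any hits here mean the redirect is client-side, i.e. the kind
# that populates the referrer when a browser follows it.
curl -s -A "Mozilla/5.0" https://t.co/4fs609qwWt \
  | grep -iE 'http-equiv="refresh"|location\.replace|window\.location'
```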
From Twitter’s POV, there’s value in more fully conveying how much traffic they send to sites, even if it slightly inconveniences users.
> You don't see this with curl/wget because they use user-agent sniffing. If they don't think you're a browser, they _will_ give you a Location header. To see it, capture a request in Firefox developer tools, right-click on the request, and choose Copy as cURL.
This is precisely why I believed OP. This is Elon Musk we're talking about.
- `time wget https://t.co/4fs609qwWt` -> `0m5.389s`
- `time curl -L https://t.co/4fs609qwWt` -> `0m1.158s`
- `time curl -A "Mozilla/5.0 (X11; Linux x86_64; rv:60.0) Gecko/20100101 Firefox/81.0" -L https://t.co/4fs609qwWt` -> `4.730s total`
- `time curl -L https://t.co/4fs609qwWt` -> `1.313s total`
Same request; the only difference is the User-Agent.
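If anyone wants a tighter comparison, curl can time itself, so the only variable left is the UA string (a sketch; both UA values are just examples):

```
for ua in "curl/8.1.2" "Mozilla/5.0 (X11; Linux x86_64; rv:60.0) Gecko/20100101 Firefox/81.0"; do
  # -w prints curl's own wall-clock total after following the redirect
  curl -s -o /dev/null -L -A "$ua" -w "%{time_total}s  ($ua)\n" https://t.co/4fs609qwWt
done
```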
Or it could even be that some junior dev removed an index.
https://imgur.com/a/qege0O9
[Edit:] I'm still seeing it with threads.net.
It seems we've become a society that rewards bad practices with attention, which is all any company on the web is trying to get: your attention.
I have a very different way of looking at this. It's not that we give attention; it's that they take it by exploiting our evolved, inflexible cognitive systems for attention/reward/desire/anger/lust. We are moths to a flame. The moth's free will isn't to blame for its inability to avoid the flame. Our cognitive systems are fixed; we can't just turn them off. If a sufficiently powerful dopamine-inducing technology is made, you can't just "opt out". It is not as simple as that. Any individual variation in the ability to opt out likely comes down to variation in genetics or other extraneous factors not inside one's immediate control.
This is where regulation needs to come in. Once you accept the reality that opting out is a comforting illusion, you can then do something about it.
https://blog.redplanetlabs.com/2023/08/15/how-we-reduced-the...
Try submitting a URL from the following domains, and it will be automatically flagged (but you can't see it's flagged unless you log out):
Edit: about 67k sites are banned on HN. Here's a random selection of 10 of them:
For example, a recent submission (of mine):
"Luis Buñuel: The Master of Film Surrealism"
it had no discussion space because (I guess) it comes from fairobserver.com. Now, I understand that fairobserver.com may have been a hive of dubious publishing historically, but it makes little sense that we cannot discuss Buñuel...
Maybe a rough discriminator (function approximator, Bayesian, etc.) could try to decide (based at least on the title) whether a submission from "weak editorial board" sites looks like material worth allowing posts on or not.
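For what it's worth, even a crude keyword score gets at the idea. A sketch (the function name, keyword list, and threshold are all made up for illustration; a real version would learn weights from past vote/flag data):

```
# Score a title against an allowlist of "substantive" keywords.
score_title() {
  printf '%s\n' "$1" | grep -oiE 'film|history|science|art|mathematics' | wc -l
}

if [ "$(score_title 'Luis Buñuel: The Master of Film Surrealism')" -ge 1 ]; then
  echo "open for discussion despite the domain"
else
  echo "keep auto-flagged"
fi
```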
<https://news.ycombinator.com/item?id=498910>
That grew fairly rapidly; it was at 38,719 by 30 Dec 2012:
<https://news.ycombinator.com/item?id=4984095> (a random 50 are listed).
I suspect that overwhelmingly the list continues to reflect the characteristics of its early incarnations.
Out of curiosity, what's the rationale for blocking archive.is? Legal reasons I assume?
Hacker News isn't an open-ended political site for people to post weird propaganda.
I can assure you that is not the case with HN when posting archive.is URLs. Proof? Look at my comment postings: https://news.ycombinator.com/threads?id=archo
Is it possible you have been shadow-banned for poor compliance with the Guidelines [1] & FAQs [2]?
[1] : https://news.ycombinator.com/newsguidelines.html
[2] : https://news.ycombinator.com/newsfaq.html
It's not banned in comments, but it is banned in submissions. @dang (HN's moderator) confirms that here: https://news.ycombinator.com/item?id=37130177
For example, I've linked to my work, but it never occurred to me to use "Show HN".
Maybe this is no big deal? Or perhaps for new signups, it would be good to “soft force” them to read the FAQ?
It's basically HN, but you can earn small tips for submissions and comments.
*Guesses it's crypto bullshit*
*Goes to website*
Yep, exactly as expected. Karma alone can mess with incentives; I cannot imagine that adding a monetary incentive does anything but make it worse. Also, crypto has the reverse Midas touch in everything I've experienced first-hand or read, so adding that into the mix is just another black mark.
Add that option to your curl tests.
Your humble anonymous tipster would appreciate it if you did a little legwork.
Here's a simpler test that I think replicates what I am indicating in the GP comment, with regard to cookie handling:
Not passing a cookie to the next stage; pure GET request:
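A hedged reconstruction of that command (the thread elides the literal invocation; `jar.txt` and the output filenames are mine):

```
# Plain GET, follow redirects, no cookies sent:
curl -sL -o no_cookie.html https://t.co/4fs609qwWt

# Separately, capture whatever cookies t.co sets, for the -b test below:
curl -s -c jar.txt -o /dev/null https://t.co/4fs609qwWt
```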
Using `-b` to pass the cookies _(same command as above, just adding `-b jar.txt`)_: look at the differences in the resulting files for 'with' and 'no' cookie. One redirect works in a timely manner; the other takes ~4-5 seconds to redirect.

What happened to net neutrality? Could it be applied to this case?
This is something else - just the ego of one rich guy petulantly satisfying his inner demons.
A five-second delay may be enough to cause a measurable increase in the "stickiness" of Twitter if some people give up in under five seconds and click or scroll onwards to something else.
Then they spend more time generating ad-revenue for Twitter than if they had gone off to the New York Times or something and started browsing over there.
As that rich guy happens to be the CEO, how is this not the prime example of "prioritising internal politics above what end users want"?
I thought it was about increasing short-term revenue.