casenmgreen · a year ago
It looks like Twitter is suppressing posts until they are spammed by hate bots and then making those posts visible.

https://bsky.app/profile/willhaycardiff.bsky.social/post/3lk...

I've also seen evidence of posts Twitter likes (violent and hateful anti-immigration posts - literally a photo of a dummy tied to a chair being shot in the back of the head) being spammed by love bots.

Twitter seems to be a propaganda channel, run by Donald/Elon/et al.

Tireings · a year ago
It's a propaganda platform since Musk bought it.

I've been saying this for ages, and I was never joking.

Plenty of real things have happened, like blocking certain people, stopping fact checking, and gutting bot protection and detection.

There is a reason why Twitter needed more people before.

gruez · a year ago
>It looks like Twitter is suppressing posts until they are spammed by hate bots and then making those posts visible.

>https://bsky.app/profile/willhaycardiff.bsky.social/post/3lk...

This could also very well be explained by a ranking algorithm that optimizes for "engagement". Getting spammed by hate bots = "engagement". This would be perfectly consistent with what the guy is experiencing, minus the accusation that the platform is suppressing anti-ukraine posts, which is totally unsubstantiated.
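The engagement-ranking explanation above can be illustrated with a toy scorer. The weights and field names here are entirely hypothetical (X's real ranking code is not public); the point is only that hostile bot replies count as engagement and raise visibility just like likes do:

```python
# Toy sketch of an engagement-optimized ranker (hypothetical weights and
# field names). Any interaction, including a hate-bot reply storm, raises
# a post's score and therefore its visibility.
def engagement_score(post):
    return 1.0 * post["likes"] + 2.0 * post["replies"] + 1.5 * post["reposts"]

def rank_feed(posts):
    return sorted(posts, key=engagement_score, reverse=True)

quiet_post = {"id": "a", "likes": 50, "replies": 2, "reposts": 5}
spammed_post = {"id": "b", "likes": 3, "replies": 400, "reposts": 1}  # bot reply storm

feed = rank_feed([quiet_post, spammed_post])
print([p["id"] for p in feed])  # ['b', 'a'] -- the spammed post ranks first
```

Under a scorer like this, no explicit suppression is needed: a post with no interactions simply never surfaces until something (even hostile spam) engages with it.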

casenmgreen · a year ago
As I understand the timing, the post was suppressed until the hate bots spammed it.

Given the post was suppressed, how did the hate bots know about it to spam it?

It seems to me Twitter suppressed the post until they had time to spam it with hate posts.

Bear in mind also that this suppression did not happen for other posts, only for the pro-Ukraine post, so at the least Twitter is specifically suppressing pro-Ukraine posts.

OmarShehata · a year ago
Important notes from the essay; this is not unique to Twitter:

> And if you think this only happens on one social network, you’re already caught in the wrong attention loop.

> The most effective influence doesn’t announce itself. It doesn’t censor loudly, or boost aggressively. It shapes perception quietly — one algorithmic nudge at a time.

readhistory · a year ago
FWIW, I don't think the person you are responding to said it only happens on Twitter. Just that it happens on Twitter.
awkwabear · a year ago
This definitely happens on other platforms as well but there is a key difference in noting that twitter is now privately owned by a single person who has shown themself to be insecure and prone to lashing publicly at critics.

I think twitter is uniquely concentrated in its influence by its owner and willingness to do things so blatantly, other platforms need to at least pretend to not steer things so directly as not to upset shareholders.

numpad0 · a year ago
Yeah, artificial delays in content delivery are silently spreading. It's not just Twitter.

Edit: is this why 4chan was hit with the disruption - because there's no room for this delay mechanism?

kmeisthax · a year ago
No, 4chan was hacked because hideyuki has terrible security hygiene and didn't update shit. Same thing happened to 2channel a decade prior.
red-iron-pine · a year ago
social media is either making money via ads, or making money by shaping consensus.

what approach do you think HN takes, and how many bots do you think are here? Cuz I don't see any ads...

hn1986 · a year ago
Billionaire buys social network for instant cultural and political influence, including amplifying his own posts. Yet there is hardly any alarm from the tech or mainstream press.

zoogeny · a year ago
It is a bit chilling because of the compound interest that this kind of policy incentivizes. Once you have a handful of powerful X accounts, you have the ability to generate more. So not only can you work to silence others, you can work to increase your capacity to silence others by promoting like-minded allies.

We are at the early stages of this, so we are watching the capture of influence. There is some discussion that influence is the new capital. And we are replicating the systems that allow for the accumulation of capital in this new digital age.

jandrese · a year ago
It's hard to see how this wasn't by design. Elon loudly released the source code for the algorithm so SEO engineers could optimize their systems to take total control over the narrative. Sure, "anybody can read it", but realistically only propagandists will go to the trouble and then have the time and resources to act on it.

He basically handed the site over to the IRA and told them to go nuts.

tclancy · a year ago
The ‘ra? Did I miss a step here?
gruez · a year ago
>IRA

Irish republican army?

razster · a year ago
That dynamic of influence compounding certainly echoes the historical patterns we’ve seen with capital—those who have it can shape systems to acquire more. But it’s worth remembering that this only holds power if we choose to participate.

Personally, I’ve stepped away from anything associated with X.com or Elon Musk. I deleted my accounts, disconnected from the ecosystem entirely—and life is better for it. No doomscrolling, no algorithmic nudging, no subtle behavioral conditioning. Influence may be the new capital, but opting out is still a form of agency. Disengagement can be a powerful act.

We often forget: participation isn’t mandatory.

stevenAthompson · a year ago
I was going to buy a Tesla. My brother had one and I coveted it. They make neat stuff.

Then Elon started taking testosterone (or whatever it was that jacked up his aggression), using psychedelics, and became incapable of keeping his mouth shut. To compound it he then got involved in politics.

Now I will never buy a Tesla, starlink, or anything else he's involved in because his behavior represents a real risk that any of those companies might cease to exist if Elon gets high and does something stupid, then I'll be stuck without support.

Similarly, a social media account is an investment. I would never invest my time into building relationships on a platform like X. Even if it does survive Musk, the community is broken permanently.

zoogeny · a year ago
I think we should be careful of too much cynicism (although too little is bad as well). There is the old Aesop tale of the fox and the grapes. Being unable to reach the grapes the fox sulks away saying "they were probably sour".

There is a lot to gain for the powerful if they can convince those they wish to hold power over that the "grapes are sour", so to speak. That leaves fewer people fighting for the few grapes available, as we stretch this analogy to its breaking point.

No man is an island, and all that. If the holders of influence decide to start a war, you are in it whether you like it or not.

archagon · a year ago
Yes, but eventually normal people will just end up leaving.
Jordan-117 · a year ago
It reminds me of Voat.co, a social news aggregator that promoted itself as a free-speech haven in an attempt to pick up disaffected Redditors during a series of moderation crackdowns circa 2015. It was initially pretty normal:

https://web.archive.org/web/20150501033432/https://voat.co/

But then they instituted karma-based throttling on participation:

https://web.archive.org/web/20170520210511/https://voat.co/v...

That, plus the influx of racists and misogynists chased off of Reddit, led to a snowball effect where the bigots upvoted themselves into power-user status and censored anyone who stood against them, which discouraged normies from sticking around, which further entrenched the bigotry. Within a few years, virtually every single new post on the site was radically right-wing, blatantly racist/sexist/antisemitic neo-Nazi shit:

https://web.archive.org/web/20200610022710/https://voat.co/

The site shut down by the end of 2020 from lack of funding.

You can see basically the same thing happening on Xitter, it's just slower because the starting userbase was so much larger, and Elon (for now) can continue to bankroll it.

ceejayoz · a year ago
AKA the “Nazi bar” problem.

https://en.wiktionary.org/wiki/Nazi_bar

mrdoops · a year ago
Manufactured consensus is everywhere there is enough attention to incentivize such an effort. The worst by far is Reddit.
jampa · a year ago
I've been using Reddit for 12 years. After the API fiasco, the quality dropped a lot. Most popular subreddits are now astroturfed, where every week there is a crusade against something (First it was for banning Twitter, now it is against banning AI Art).

Even in regular posts, Reddit has been a hive mind lately. If you scroll through the comments, most of them will have the same opinion over and over, with comments that add nothing to the discussion, like "I agree," getting hundreds of upvotes.

kridsdale3 · a year ago
I've been there for 17 (!) years, and I could have written pretty much the same message as you since around 2012. Dennis Kucinich was a huge campaign!

But I agree, since the API thing, it has sucked HARD.

Aurornis · a year ago
> I've been using Reddit for 12 years. After the API fiasco, the quality dropped a lot. Most popular subreddits are now astroturfed, where every week there is a crusade against something (First it was for banning Twitter, now it is against banning AI Art).

This didn’t start with the API change drama. The API change protests were their own crusade. The calls to ban Twitter links or AI art are just the next iterations of the same form of protest.

Many of the big subs were thoroughly astroturfed long before the API changes. The famous ones like /r/conservative weren’t even trying to hide the fact that they curated every member and post

DustinBrett · a year ago
Happy to see posts like this; I have the same experience. It fell apart a few years ago with the fiascos, and it's now a shell of what it was. Total echo chamber. Sadly it seems to be spreading to HN in some comment sections. And X has its problems in the other direction. There aren't many places left like how it was, when up and down votes meant something.
andrepd · a year ago
> Reddit has been a hive mind lately. If you scroll through the comments, most of them will have the same opinion over and over, with comments that add nothing to the discussion, like "I agree," getting hundreds of upvotes.

That has been the case for over 10 years now. It's absolutely not a new phenomenon.

unethical_ban · a year ago
The API shutdown allowed a flood of bots and crippled the third-party apps and moderator tools that kept things clean.

But I don't think the "crusades" are always bot related. Movements get momentum.

os2warpman · a year ago
> now it is against banning AI Art

AI art does not exist. There is only slop stolen from artists.

windsignaling · a year ago
Not just lately. See /r/politics. Sometime in the last 5-7 years /r/science or /r/technology (or both, I forget because I stopped reading) basically became the science/tech versions of /r/politics.
chneu · a year ago
Lol dude reddit has been heavily manipulated since like 2013, if not earlier.

I was heavily involved in buying/selling spam accounts for years on reddit. If you think it wasn't heavily manipulated, at least the frontpage, then lol you were buying it like everyone else.

dmonitor · a year ago
> banning Twitter

This is just practical given you can't see tweet threads (and sometimes even tweets) without an account.

> against banning AI Art

I think you mean to say reddit is pro-banning AI art?

Anyway, banning AI art is absolutely good for curating quality posts. AI art is incredibly low-effort, easily spammable, and has legitimate morality concerns among artist communities (the kind that post high quality content). Same goes for obviously AI-written posts.

I agree content quality on the site has fallen drastically, but those are both measures to try and save it.

guywithahat · a year ago
For all the negative things one can say about X, their fact checking (community notes) has actually gotten pretty good, which is something Reddit has yet to implement. Pew has also been ranking them more politically center than most social media sites, although I suppose that's subjective
jandrese · a year ago
I like the community notes as a concept, but they're often a day late and a dollar short. By the time the community note appears the post has been squeezed of all of its juice and was already on the way out. It's better than nothing, but the entire mechanism runs slower than the speed of propaganda.
jimbob45 · a year ago
Reddit has stickied posts at the top of each thread. Well-moderated subreddits use them to great effect. Badly moderated subreddits just shadowban everything that doesn’t match with the mods’ politics.
tough · a year ago
TikTok recently added Footnotes.
ty6853 · a year ago
The most glaring example of this was how reddit did a total 180 before/after the election. Before the election questioning putting a candidate in without a primary was sacrilege. Afterwards it was a popularly supported reason for the loss. It was like watching an inflated balloon of propaganda deflate.
meroes · a year ago
After the election, the amount of [Removed by Reddit] went from very little, to EVERYWHERE.

That's what did it for me, zero Reddit unless I can't find the information anywhere else, and even then it's for viewing a single post and then I'm gone.

cmdli · a year ago
In the few days following the election, there was a flood of conservative posters all over the place. After about a week, they all disappeared and Reddit returned to its usual politics. I think the difference you are seeing is an atypical amount of conservatism, not the other way around. Most people who voted for Harris still do not think that the lack of a primary was the issue.
alabastervlog · a year ago
That's bizarre. Putting her at the top of the ticket was very clearly the better of two bad options (it was too late for the better options, by the time the call was made).

There exist people who think Biden had a better shot and replacing him with Harris was a mistake? Did they not look at his approval ratings earlier that year, then look up what that has historically meant for presidential re-elections? He was going to lose, and by the time of the replacement he was likely going to get crushed. The replacement probably helped down-ballot races, given how badly Biden was likely to perform, so it was a good idea even though she lost.

Like, yes, it was bad per se, but people blaming that for the defeat is... confusing to me.

dmonitor · a year ago
That's just hindsight being 20/20.
Klonoar · a year ago
This slightly speaks to what subreddits a person reads, because I can tell you I had the exact opposite experience. People seemed still very pissed off about it.
raffraffraff · a year ago
Was just gonna say this. Reddit is dreadful. Anything remotely contentious has a single narrative, and if people try to present any alternative perspective, comments get locked. Disagreement = "hate".
viccis · a year ago
Reddit is SO MUCH WORSE than most people understand. Ignoring for a moment that people's frontpage "Best" sort has used engagement metrics rather than upvotes/downvotes since 2021, the moderators there have an iron grip over what is allowed.

r/redditminusmods used to track this. Every 12 hours they'd take a snapshot of the top 50 posts and then check ones from the previous 12 hour snapshot to see what percentage had been deleted. When it started, it was averaging 20% or so. By the end, it was at 50/50 or 49/50 deleted almost every single 12 hour period.
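The tracking method described above can be sketched in a few lines. The function and data here are hypothetical (the subreddit's actual tooling was never published); it just compares the previous 12-hour snapshot of top-post IDs against what is still visible:

```python
# Sketch of the r/redditminusmods methodology as described: diff the
# previous 12-hour snapshot of the top 50 post IDs against what is still
# visible now, and report how many were deleted and at what rate.
def deletion_rate(prev_top_ids, still_visible_ids):
    removed = [pid for pid in prev_top_ids if pid not in still_visible_ids]
    return len(removed), len(removed) / len(prev_top_ids)

prev_snapshot = [f"post{i}" for i in range(50)]    # top 50 twelve hours ago
visible_now = {f"post{i}" for i in range(25, 50)}  # half have since vanished
count, rate = deletion_rate(prev_snapshot, visible_now)
print(count, rate)  # 25 0.5
```

A "50/50 deleted" period in the comment above corresponds to a rate of 1.0 under this scheme.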

Of course, reddit couldn't allow this level of scrutiny, so they banned that subreddit for unstated reasons, and now the only good google result for it actually leads back here. See for yourself how bad it was: https://news.ycombinator.com/item?id=36040282

That only goes to two years ago. It feels like it's gotten even worse since then. That's not even going into some subreddits (worldnews, politics, etc.) creating the illusion of consensus by banning anyone with an opinion outside of a narrow range of allowed ones.

jandrese · a year ago
> r/redditminusmods used to track this. Every 12 hours they'd take a snapshot of the top 50 posts and then check ones from the previous 12 hour snapshot to see what percentage had been deleted. When it started, it was averaging 20% or so. By the end, it was at 50/50 or 49/50 deleted almost every single 12 hour period.

Is this "mods run amok", or is it bots gaming the algorithm more effectively and now accounting for nearly half of all new popular content?

In general my advice to anyone considering Reddit is to start with the default list of subreddits that you get when not logged in. Delete all of those from your list, and track down the small subreddits that interest you. The defaults are all owned by professional influence peddlers now, and what little actual content seeps through is not worth the effort to filter out.

omneity · a year ago
This would be such an interesting experiment to perform on other social platforms as well alongside some rough semantic analysis to understand which topics are being silenced.

I already got quite a lot of the data pipeline setup for this, so if anyone wants to collab hit me up!

richwater · a year ago
> The worst by far is Reddit

The website is truly unusable unless you directly go to small niche subreddits and even then you roll the dice with unpaid mods with a power complex.

adeeds · a year ago
The smaller niche subreddits dedicated to a hobby or type of product are actually some of the worst for astroturfing from what I've seen. It only takes a few shills to start building consensus.

There's a really interesting pattern where you'll see one person start a thread asking "Hey, any recs for durable travel pants?" Then a second comment chimes in "No specific brands, just make sure you get ones that have qualities x, y, and z". Then a third user says "Oh my Ketl Mountain™ travel pants have those exact traits!" Taken on their own the threads look fairly legit and get a lot of engagement and upvotes from organic users (maybe after some bot upvoted to prime the pump)

Then if you dump the comments of those users and track what subreddits they've interacted on, they've had convos following the same patterns about boots in BuyItForLife, Bidets in r/Bidets, GaN USB chargers in USBCHardware, face wash in r/30PlusSkincare, headphones, etc. You can build a whole graph of shilling accounts pushing a handful of products.
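The cross-subreddit pattern described above can be sketched as a small co-occurrence graph. The usernames and data here are made up for illustration; real detection would pull comment histories via an API:

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical dump of user comment histories:
# (username, subreddit, product_mentioned) tuples.
comment_log = [
    ("userA", "travelpants", "BrandX"),
    ("userB", "travelpants", "BrandX"),
    ("userA", "Bidets", "BrandX"),
    ("userB", "Bidets", "BrandX"),
    ("userC", "travelpants", None),
]

# Group accounts by the (subreddit, product) threads they pushed, then
# weight edges by how many such threads each pair of accounts shares.
threads = defaultdict(set)
for user, sub, product in comment_log:
    if product:
        threads[(sub, product)].add(user)

edges = defaultdict(int)
for members in threads.values():
    for a, b in combinations(sorted(members), 2):
        edges[(a, b)] += 1

# Pairs that repeatedly co-promote across unrelated subreddits stand out.
suspicious = {pair: w for pair, w in edges.items() if w >= 2}
print(suspicious)  # {('userA', 'userB'): 2}
```

Organic users rarely co-occur with the same partner pushing the same brand across unrelated hobbies, which is why the edge weight is a useful signal.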

baq · a year ago
This was as true when I joined ~15 years ago as it was on the day they made me quit cold turkey by taking the API away.
RankingMember · a year ago
It's great for web searches for answers to very specific questions. "search term" + "reddit" typically gives me a good starting point if not the answer itself to the odd question I have.
raffraffraff · a year ago
I detest having to keep an account, but unfortunately there are a bunch of different products that use it as a semi-official support forum.
raverbashing · a year ago
And use old Reddit, the only interface not designed with the TikTok brain in mind.

(and the mobile app is just atrocious, RIF was way better in usability, etc)

ethagknight · a year ago
Manufactured consensus is literally the name of the game for the big news networks. News is/was paid vast sums by the government to tell a certain story. That is manufactured consensus. Some countries do a better job of making the news seem like a separate arm from the government. The entire point is to direct the populace. That is not the core focus of X, even though it is entirely susceptible to it, and it will be encountered on any such platform. Yes, Reddit is horrible, but I would say Wikipedia is even more dangerous because it presents itself as basic fact. On Reddit, at least, you know it's some obscene username giving geopolitical strategy rants.

Important to note, I first saw this specific chart and claim of Musk's heavy handed influence via X. Also, I see plenty of dissenting opinions (in a general sense on Trump, Tariffs, Musk, DOGE, etc) on X. Alternative views definitely have reach.

Also important to note: my posts, where I am very knowledgeable in my domain and will spend an unreasonable amount of time authoring them to make various points, garner mere double-digit views, so when someone cries about no longer having millions of views for their uneducated hot takes... spare the tears.

seadan83 · a year ago
Outside of PBS, do you have evidence for this claim: "News is/was paid vast sums by the government to tell a certain story"?

> Alternative views definitely have reach.

Yes, but are we in a 1984 situation where that reach is managed behind the scenes? Reach, but perhaps not too much reach. With respect to the chart, how do we know that Twitter users are not largely partitioned? How representative is the fact that you saw something compared to other "communities" on X?

All the while, even if you saw a 'dissenting' chart, the fact that the chart exists is direct evidence of the power of a subtle shadow-ban effect. It's not about tears and whining; it's that a single act by 'powerful' accounts can control who gets visibility and who does not. The point is that it is not you, the community, that controls what is popular; it is the powerful accounts. That is the issue.

jandrese · a year ago
> News is/was paid vast sums by the government to tell a certain story.

In the US it is not the government paying these sums, it is the billionaires who bought the media outlets. When you look for editorial bias in the US it's not pro-government, it's pro-wealth. Or more specifically pro-wealthy people.

> I would say Wikipedia is even more dangerous because it presents as basic facts.

Can you give some examples of political bias in Wikipedia articles?

smallmancontrov · a year ago
Musk didn't just put a thumb on the scale in favor of far-right content, he sat his entire pre-ozempic ass on the scale.
austin-cheney · a year ago
I deleted my Reddit account years ago because of echo chamber effect and other people intentionally using that to direct opinion. In all fairness though there is an inherent narcissistic incentive to influence popular opinion irrespective of evidence or consequences. This will continue to be true so long as people rely upon social acceptance as a form of currency.
gruez · a year ago
I'm surprised how many upvotes this got (40 points as of me writing this comment), given how little "meat" is actually in this article. The author presents a graph where views for a given user dropped precipitously after a "feud with musk". That's certainly suspicious, and was worth bringing up, but the rest of the blog is just pontificating about "social engineering" and "perception cascades", backed by absolutely nothing. Are people just upvoting based on title and maybe the first paragraph? This post could have been truncated to the graph and very little would be lost.
freehorse · a year ago
Yeah I also hoped that the article had some more backing for these arguments. The nytimes article, which is cited and from which the first graph is from, is more interesting, as it also includes a couple more cases:

https://www.nytimes.com/interactive/2025/04/23/business/elon...

or from webarchive

https://web.archive.org/web/20250423093911/https://www.nytim...

janalsncm · a year ago
EM directly manipulates the algorithm to suit his interests. Here’s one we know about: https://www.theguardian.com/technology/2023/feb/15/elon-musk...
ruleryak · a year ago
This article does not offer any proof. It's hearsay, from the title saying he "reportedly" forced it, in turn citing a Platformer article that itself also provides no proof and instead accepts the stories from fired engineers as gospel. The platformer article then goes on to say that views still fluctuate wildly, and that it isn't in line with a supposed 1000x boost. The same Platformer article then says that they believe the supposed 1000x boost is no longer in effect, but they guess something else must be in place. The Guardian article doesn't bother to mention that part.
hashstring · a year ago
Agree about meat, however, the article still made me think.

> What people see feels organic. In reality, they’re engaging with what’s already been filtered, ranked, and surfaced.

Naturally, I (and I think many humans have this too) often perceive comments/content that I see as backscatter of organic content reflecting some sort of consensus. Thousands of people replying the same thing surely gives me the impression of consensus. Obviously, this is not necessarily the truth (and it may be far from it). However, it remains interesting, because since more people may perceive it as such, it may become consensus after all regardless.

Ultimately, I think it’s good to be reminded about the fact that it’s still algorithmic knobs at play here. Even if that’s something that is not news.

cogitovirus · a year ago
really appreciate the constructive criticism — you're totally right, i should’ve highlighted the sources more clearly.

there were three main pieces of actual evidence i leaned on:

1. the nytimes story on account silencing, https://www.nytimes.com/interactive/2025/04/23/business/elon...

2. the visible boost of low-value tweets (though i should have connected to the x api hose and quantified the data),

3. this paper — https://github.com/timothyjgraham/AlgorithmicBiasX/blob/3f4c...

which actually didn’t make it into the post because the essay was basically finished by the time i remembered i had it. i should have included it. it shows an algo change that boosts elon’s reach specifically.

every other source was speculative:

superbowl fiasco - https://www.theguardian.com/technology/2023/feb/15/elon-musk... - little evidence

elon forced 100 engineers to boost his tweets ? - hearsay

there are also supposed whistle blowers - https://substack.com/@theconcernedbird/p-154577954 - but again. no evidence.

And the algorithm is still closed source.

i’m sorry the writing beyond the graph didn’t land for you.

mr gruez - I'll do better next time.

-- the author

Fidelix · a year ago
They are upvoting because they hate Elon Musk. It's not that deep.

MaxPock · a year ago
X is once again full of bots selling crypto and financial services .
sojsurf · a year ago
I went back recently. Maybe I'm in the wrong circles, but I'm seeing neither of these.

I _am_ still seeing lots of recycled content looking for clicks.

stetrain · a year ago
It never really stopped. All of Elon's crying about bots stopped as soon as he took ownership.
josefritzishere · a year ago
It's really degenerated into a trash heap. I quit years ago.
mmastrac · a year ago
I pop in from time to time but I only ever see right-wing rage bait (??) and my old timeline is completely gone. I don't engage with any of it either, just scrolling until I finally catch a name I recognize and maybe dropping a like.
2OEH8eoCRo0 · a year ago
He fixed the bot problem, the problem of Twitter banning Elon's bots.
arrowsmith · a year ago
Hey, that's not fair! It's also full of porn bots and Holocaust denial.
jjeaff · a year ago
Don't forget the graphic fight videos with comments full of racial undertones. I have literally never engaged with nor watched more than a few seconds of those types of videos yet my feed is full of them.
dismalaf · a year ago
Every platform has Holocaust denial because that's the one thing that the far-left and far-right both agree on...
w10-1 · a year ago
Is the title ironic? Is this helping?

"Manufacturing Consent", the book by Chomsky and Herman, details techniques that are largely unused in this situation. Chomsky's book, by disclosing the hidden editor, works against the effect rather than for it.

Here it's closer to a state-run media outlet, with the exact ambiguity that implies: a known editor pretending to be objective, except here the editor only really cares about certain topics, and others are encouraged to roam freely (if traceably).

In Chomsky's case, the editor's power comes from being covert, but only if people are fooled, so the book works to diminish it. In this case, the power comes from the editor being known and unstoppable. You have to accept it and know yourself as accepting it, which means you have to buy in as a fan to avoid seeing yourself as herded, or out yourself as an outsider. Since most people take the default step of doing nothing, they accumulate evidence of themselves as either herded or a fan. It's a forcing function (like "you're with us or against us") that conditions acceptance rather than manufacturing consent.

In this case, articles (showing what happens when you oppose the editor) and ensuing discussions like this one (ending in no action) have the reinforcing effect of empowering the editor and increasing the self-censoring effects. They contribute to the aim and effect of conditioning acceptance. So they might not be helpful. (Better would be abandoning a platform when it fails to fulfill its fairness claims, but that's hard to engineer.)

notfed · a year ago
"Manufacturing consensus" is, at least, the title of a separate book more appropriately fitting the theme of the article.
627467 · a year ago
The article’s angst over X’s “manufactured consensus” is overblown. Influence has always been curated—editors, town criers, or whoever grabbed the mic were the analog algorithms. X’s sin isn’t some evil algo: it’s just running at planetary scale. We’ve ditched thousands of small communities for one global shouting match, so naturally mega-influencers steal the show. Algorithms are just the gears keeping this chaos moving because we crave instant, worldwide chatter. Some folks pretend a perfect algorithm exists (bsky, IG/fb) but it doesn’t come from one team, one database, or one set of criteria. The “perfect” system is a messy web of different algorithms in different spaces, barely connected, each with its own context. Calling out X’s code misses the mark. We signed up for this planetary circus and keep buying tickets.
throwaway7783 · a year ago
But there is no denying that there is a shift in narrative in X posts since its acquisition. So there is certainly more going on than just planetary scale. It was planetary scale before acquisition too. Algorithms have the power to nudge the narrative one way or another at planetary scale.
ljsprague · a year ago
It was completely transparent and unbiased before its acquisition.
EcommerceFlow · a year ago
Yes, the natural order (that was mass censored for 10+ years) got uncensored. Look at who won the presidency.

qnleigh · a year ago
Do we know that this is how the algorithm actually works? The article only shows one plot of one specific instance, and there could be more than one explanation for the sudden drop in viewership (especially given the involvement of Twitter's owner).
jsheard · a year ago
> Do we know that this is how the algorithm actually works?

Funnily enough we should know that, since Elon promised to open source the algorithm in the name of transparency. But what actually happened is they dropped a one-time snapshot onto GitHub two years ago, never updated it, and never acknowledged it again. Many such cases.

jandrese · a year ago
It was enough info that people who professionally post on X/Twitter can play the algorithm like a fiddle. They can get anything they want to the top, and often can even get Elon to re-tweet it.
hashstring · a year ago
Yes, this 100%.

And never forget the isElon boolean var that would increase post visibility. lol, what a shame.

a2128 · a year ago
Below the chart there's a link to a NYTimes article it was sourced from, which has more plots of more instances of this
janalsncm · a year ago
Their “for you” feed is engagement bait. In other words, it appears to be running almost entirely on CTR. It seems to pull from a pool of posts that are engaged with by those you follow. Limit seems to be 24h.

It’s not a very sophisticated algorithm, likely because the best people aren’t super keen on working there for WLB reasons.