I've spent a lot of my life and money (over five years) thinking about building better algorithms.
We have a bit of a chicken-and-egg problem: is the problem the algorithm, or the preferences of the Users?
I'd argue the latter.
What I learned, which was counter-intuitive, was that the vast majority of people aren't interested in thinking hard. This community, in large part, is an exception where many members pride themselves on intellectually challenging material.
That's not the norm. We're not the norm.
My belief that every human was by nature "curious" and wanted to be engaged deeply was proven false.
This isn't to claim that incuriosity is our nature, but when testing with huge populations in the US (specifically), that's not how adults behave.
The problem, to me, is deeper and is rooted in our education and work systems, which demand compliance over creativity. Algorithms serve what Users engage with; if Users were no longer interested in ragebait and clickbait and instead focused on thoughtful content, the algorithms would adapt.
> Is it the algorithm or is it the preference of the Users which is the problem? I'd argue the latter.
> Algorithms serve what Users engage with
User engagement isn't actually the same thing as user preference, even though I think many people and companies take the shortcut of equating the two.
People often engage more with things they actually don't like, and which create negative feelings.
These users might score higher on engagement metrics when fed this content, but actually end up leaving the platform or spending less time there, or would at least answer in a survey that they don't like some or most of the content they're seeing.
This is a major reason I stopped using Threads many months ago. Their algorithm is great at surfacing posts that make me want to chime in with a correction, or click to see the rest of the truncated story. But that doesn't mean I actually liked that experience.
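To make the distinction concrete, here's a toy simulation (all numbers and categories invented for illustration) of how an engagement-maximizing ranker can score well on its own metric while filling the feed with content the user would say they dislike:

```python
import random

random.seed(0)

# Hypothetical content items: predicted engagement probability vs. whether
# the user would say they like the item if surveyed. The two signals are
# deliberately anti-correlated here: ragebait engages more but is disliked.
items = (
    [{"kind": "ragebait", "engagement": random.uniform(0.6, 0.9), "liked": False}
     for _ in range(50)]
    + [{"kind": "thoughtful", "engagement": random.uniform(0.2, 0.5), "liked": True}
       for _ in range(50)]
)

# An engagement-maximizing ranker: show the 10 most engaging items.
feed = sorted(items, key=lambda x: x["engagement"], reverse=True)[:10]

engagement_score = sum(x["engagement"] for x in feed) / len(feed)
satisfaction = sum(x["liked"] for x in feed) / len(feed)

print(f"mean predicted engagement of feed: {engagement_score:.2f}")
print(f"share of feed the user would say they like: {satisfaction:.0%}")
# The engagement metric looks great; the survey answer does not.
```

If the platform only ever measures `engagement_score`, the gap never shows up in its dashboards.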
> Algorithms serve what Users engage with; if Users were no longer interested in ragebait and clickbait and instead focused on thoughtful content, the algorithms would adapt.
Algorithms have already adapted; they are succeeding at their goals. We've put some of the smartest people on the planet on this problem for the last 20 years.
Humans are notoriously over-sensitive to threats; we see them where they barely exist, and easily overreact. Modern clickbait excels at presenting mundane information as threatening. Of course this attracts more attention.
Also, loud noises attract more attention than soft noise. This doesn’t mean that humans prefer an environment full of loud noises.
I don’t know if I buy the explanation that this was due to the feed algorithm. It looks like an artifact of being exposed to X’s current user base instead of their old followers. When Twitter switched to X there was a noticeable shift in the average political leanings of the platform toward alignment with Musk, as many left-leaning people abandoned the platform for Bluesky, Mastodon, and Threads.
So changing your feed to show popular posts on the platform instead of just your friends’ Tweets would be expected to shift someone’s intake toward the average of the platform.
Is this the result of a feedback loop from Musk joining, or did he just accelerate the overall decline of the platform? Some might say it was heading this way even before he bought it, but his arrival was certainly an inflection point either way.
All modern social media is pretty toxic to society, so I don't participate. Even HN/Reddit is borderline. Nothing is quite as good as the IRC and forum culture of the 2000s, where everyone was truly anonymous and almost nobody tied any of their worth to what exchanges they had online.
The moderation changes absolutely changed posting behavior. People got banned for even faintly gesturing in the wrong direction on many issues, and it frightened large accounts into toeing the line.
It's the proliferation of downvoting. It disincentivizes speaking your honest opinion and artificially boosts mass-appeal ragebait.
It's detrimental to having organic conversations.
"But the trolls" they say.
In practice it's widely abused.
Using HN as an example, there are legitimate textbook opinions that will boost your comment to the top, and ones that will quickly sink to the bottom and often be flagged away for disagreement. Ignoring obvious spam which is noise, there is no correlation to "right" or "wrong".
That's one advantage old-school discussion forums and imageboards have. Everyone there and all comments therein are equally shit. No voting with the tribe to reinforce your opinion.
What's worse is social media allowed the mentally ill to congregate and reinforce their own insane opinions with plenty of upvotes, which reinforces their delusions as a form of positive feedback. When we wonder aloud how things have become more radicalized in the last 20 years — that's why. Why blame the users when you built the tools?
I'm not sure what your point is. How is "being exposed to X's current user base instead of their old followers" not equivalent to "turning on the feed algorithm"? You doubt the effect is due to the algorithm, but your alternative explanation describes exactly what the algorithm does.
I don't know what changes have been made more recently, but I know there was a definite change to the Twitter algorithm a few months ago that filled the feeds of conservatives with posts from liberals and vice versa. It seemed to be specifically engineered to provoke conflict.
> When Twitter switched to X there was a noticeable shift in the average political leanings of the platform toward alignment with Musk, as many left-leaning people abandoned the platform for Bluesky, Mastodon, and Threads.
Do you have any numbers? In my experience it's still mostly communists.
I deleted my account after many years when X recently made the Chronological Feed setting ephemeral, defaulting back to the Algorithmic Feed each time the page is refreshed.
No way am I going to let that level of outrage-baiting garbage so much as flash before my eyes.
I just click "following" at the top and never see anything I didn't ask to see. It resets once every few months to the other tab which I assume is just the cookie setting expiring.
Train it: I just have to spend 3 minutes every other year to tap the 3 dots on every post and choose "Not Interested", for an epic feed unmatched anywhere.
The uncomfortable truth to most "the algorithm is biased" takes is that we humans are far more politically biased than the algorithms and we're probably 90% to blame.
I'm not saying there is no algorithmic bias, and I tend to agree the X algorithm has a slight conservative bias, but for the most part the owners of these sites care more about keeping your attention than about getting you to vote a certain way. Therefore if you're naturally susceptible to culture-war stuff, and this is what grabs your attention, it's likely the algorithm will feed it.
But this is a far broader problem. These are the types of people who might have watched politically biased cable news in the past, or read politically biased newspapers before that.
The issue brought up in the article isn't that "the algorithm is biased" but that "the algorithm causes bias". A feed could perfectly alternate between position A and position B and show no bias at all, yet still select more incendiary content on topic A and drive bias toward or away from it.
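A toy sketch (entirely hypothetical data) shows how a feed can have zero exposure bias between topics while still being systematically more incendiary than the underlying pool of posts:

```python
import random

random.seed(1)

# Hypothetical posts: each has a topic ("A" or "B") and an "incendiary"
# score in [0, 1]. Scores are drawn identically for both topics, so the
# pool itself is balanced in both exposure and heat.
posts = [{"topic": t, "incendiary": random.random()}
         for t in ("A", "B") for _ in range(100)]

def build_feed(posts, length=20):
    """Strictly alternate topics (zero exposure bias), but within each
    topic pick the most incendiary not-yet-shown post."""
    by_topic = {t: sorted((p for p in posts if p["topic"] == t),
                          key=lambda p: p["incendiary"], reverse=True)
                for t in ("A", "B")}
    return [by_topic["AB"[i % 2]].pop(0) for i in range(length)]

feed = build_feed(posts)
share_a = sum(p["topic"] == "A" for p in feed) / len(feed)
feed_heat = sum(p["incendiary"] for p in feed) / len(feed)
pool_heat = sum(p["incendiary"] for p in posts) / len(posts)

print(f"topic A share of feed: {share_a:.0%}")  # exactly balanced
print(f"mean incendiary score, feed vs. pool: {feed_heat:.2f} vs. {pool_heat:.2f}")
```

An exposure audit of this feed would report perfect 50/50 balance, while the selection-within-topic step quietly maximizes heat.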
I have the same thought; my X algorithm has become less political than Hacker News. I suppose it depends on how you use it, but my feed is entirely technical blogs, memes, and city planning/construction content.
I've been pretty consistent about telling Bluesky I want to see less of anything political and also disciplined about not following anybody who talks about Trump or gender or how anybody else is causing their problems. I see very little trash.
Yeah, this was always the play looking back in hindsight. Like, I didn't get it: "why would you pay that kind of money for a web forum?!" It wasn't the forum that was important; Twitter (for better or worse) has wormed its way into the fabric of American discourse. He was basically buying the ideological thermostat for the country and turning the dial to the right.
As opposed to government funding turning it to the reality-bending left? There was direct communication from Senators and members of Congress directing Twitter to block and ban based on certain topics. And Twitter obliged.
I didn't get it either, until I trained the algorithm to feed me what I want by just clicking the three dots and selecting "Not Interested" on anything I never wanted to see again. It listens; what's left is really unmatched anywhere. I've really looked, and occasionally still do out of curiosity.
Lots of info is shared there first; it shows up in news articles and podcasts 12-24 hours later. Not everything shared there is true, of course, so one has to do due diligence. But it definitely surfaces content that wouldn't show up if I just read the top 2-3 news websites.
What does it mean to have someone on a chronological feed, versus the algorithmic one? Does that mean a chronological feed of the accounts they follow? I hardly ever use that, since I don't follow many people, and some people I follow post about lots of stuff I don't care about
from the study:
> We assigned active US-based users randomly to either an algorithmic or a chronological feed for 7 weeks
I don't follow people who post about lots of stuff I don't care about. I follow hashtags instead, which gives me a much higher signal to noise ratio than following those people.
I thought hashtags were dead. Does this surface tweets that just have the word, not the hashtag? Can you follow multiple words using boolean operators?
"We need more funding into open protocols that decentralize algorithmic ownership; open platforms that give users a choice of algorithm and platform provider; and algorithmic transparency across our information ecosystem."
This sounds like a call to separate the aggregation step from the content. Reasonable enough, but does it really address the root cause? Aren't we just as polarized in a world where there are dozens of aggregators into the same data and everyone picks the one that most indulges their specific predilections for engagement, rage, and clicks?
What does "open" really buy you in this space?
Don't get me wrong, I want this figured out too, and maybe this is a helpful first step on the way to other things, but I'm not quite seeing how it plays out.
I will note that political extremists can have more interesting content at times, and it’s good to see what they are up to in case it will affect you. They also sometimes surface legitimate stories that are kept out of the mainstream press, or which are heavily editorialized or minimized there. But you definitely have to view all sides to gain an accurate view here, it would be a mistake to read only one group of extremists. And it’s almost always a mistake to engage with any of them.
Curious about this. Don't have an angle, just trying to survey your perspective.
You shared:
> People often engage more with things they actually don't like, and which create negative feelings.
Do you think this is innate or learned? And, in either case, can it be unlearned?
> That's not the norm. We're not the norm.
I recommend against putting HN on a pedestal. It just leads to disappointment.
> Do you have any numbers? In my experience it's still mostly communists.
Look at that: https://news.ycombinator.com/item?id=46504404
In Twitter's case, you had regime officials directing censorship illegally through open emails and meetings.
It's no surprise that the needle moves right when you dial back the suppression of free expression even a little bit (X still censors plenty).