Grim-444 · 4 years ago
The crux of the study seems to be that while, yes, conservatives were definitely censored more often, this was "deserved" because they were spreading misinformation.

However, the study defined "misinformation" via a separate referenced study that basically just asked 1,000 people on Mechanical Turk (and 8 "independent" fact checkers) whether they trusted particular news sources. The sources that random members of the public rated as less trustworthy were then classified as "misinformation spreaders".

So there was really no objective evaluation of whether anything the censored conservatives said was actually true or false, real or fake, legitimate or not, hurtful or helpful, or any analysis of that kind at all. It just came down to whether they were more often referencing sites that random members of the public believe to be wrongthink spreaders.
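
To make the mechanics concrete, here's a rough sketch of what that kind of source-trust pipeline amounts to. The domains, scores, and cutoff below are invented purely for illustration and don't come from the study; the real work used roughly 1,000 raters and 8 fact checkers scoring real news domains:

    # Hypothetical sketch: crowdsourced trust ratings for news domains,
    # averaged and thresholded to label "low quality" sources, then used
    # to score users by how often they link to those sources.
    ratings = {
        "example-news-a.com": [5, 4, 5, 4, 5],   # made-up 1-5 trust scores
        "example-news-b.com": [2, 1, 2, 3, 1],
    }
    TRUST_CUTOFF = 3.0  # arbitrary threshold, purely for illustration

    def avg_trust(domain):
        scores = ratings[domain]
        return sum(scores) / len(scores)

    # Domains below the cutoff become the "misinformation source" list.
    low_quality = {d for d in ratings if avg_trust(d) < TRUST_CUTOFF}

    def user_misinfo_share(shared_domains):
        # Fraction of a user's shared links that point at low-quality domains.
        # No individual claim is ever fact-checked anywhere in this pipeline.
        if not shared_domains:
            return 0.0
        return sum(d in low_quality for d in shared_domains) / len(shared_domains)

    print(user_misinfo_share(["example-news-b.com", "example-news-a.com"]))  # 0.5

Note that nothing in this loop ever checks whether a specific claim is true; "misinformation" is just "shares links to domains the crowd distrusts".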

dgellow · 4 years ago
Direct link to the preprint, in case you don't care that much about techdirt writing: https://psyarxiv.com/ay9q5
native_samples · 4 years ago
This is one of a recent run of hilarious attempts to claim that anti-conservative bias on social networks (or alternatively, enforcement of left-wing orthodoxy) doesn't really exist at all. This one is especially entertaining for the pretzel-like contortions the authors go through.

> it’s become a key part of “the narrative” that social media websites have an “anti-conservative bias” in how they moderate. As we’ve pointed out over and over again there remains little evidence to support this.

From the study they're linking to:

> We then investigated potential political bias in suspension patterns and identified a set of 9,000 politically engaged Twitter users, half Democratic and half Republican, in October 2020, and followed them through the six months after the U.S. 2020 election. During that period, while only 7.7% of the Democratic users were suspended, 35.6% of the Republican users were suspended

It literally shows Republicans getting suspended at drastically higher rates than Democrats but, according to TechDirt, there is no evidence of any anti-conservative bias anywhere. It's just a fantasy. Because, see, social media platforms aren't biased against conservatives, just people who are 'wrong' about politics, which is totally different! The study is exactly the kind of ideologically biased pseudo-science that makes the social sciences such a laughing stock. It's just a giant exercise in circular reasoning. Because social networks don't literally say they're banning people for ideological reasons, nobody is banned for ideological reasons. That's the study, in a nutshell.

multjoy · 4 years ago
You seem to ignore the possibility that those 35.6% deserved those suspensions.
native_samples · 4 years ago
Ignored, because neither article nor study provides any logical reason to believe that. The study is not a neutral study of misinformation (they never are). Instead it defines misinformation as anything wacky that only Republicans believe, then uses that as a justification for erasing them from the platforms.

The equivalent in the other direction would be if Elon Musk took over Twitter and then banned everyone who had claimed at any point that lockdowns worked, on the grounds that they were spreading misinformation, whilst completely ignoring claims that COVID doesn't exist at all. It's absolutely true that this is misinformation - just look at the total failure in Shanghai or the conspicuously missing evidence of impact from western lockdowns - but it would disproportionately affect Democrats if that were the only component of the definition.

mikehodgson · 4 years ago
Have you considered that, especially immediately after the election, one side of the political spectrum was propagating much more misinformation than the other?
native_samples · 4 years ago
Yes, considering that it's the entire thesis of the study and how they try to claim there's no ideological bias.

Do I really have to explain why this is ridiculous on HN? No censorship regime has ever said "we are censoring this because we're ideologically opposed to it". There are always other justifications, often paper thin, but they exist. A common excuse is that anyone making the government look bad is "spreading rumours", for example (China uses this one a lot). In the west it's exactly the same, except they use the word "misinformation" rather than "rumours".

The study in question is nonsensical - like all other such studies - because it claims that people are getting banned for spreading "misinformation" and not being conservative, but doesn't have any rigorous definition of what misinformation is. Instead they asked a bunch of self-proclaimed "fact checkers" (i.e. the people tasked with enforcement of ideological orthodoxy), and as a backup measure picked QAnon and said, that's misinformation (all of it).

Is QAnon misinformation? Yes, it is. Nonetheless it's obviously not a complete definition of the problem. I'm not American, but I recall very well that after Trump won there was, for years, a massive, organized misinformation campaign claiming that Trump was a secret Russian agent. If they'd included that particular conspiracy theory in their definition they'd have found that there were lots of Democrats spreading misinformation too, but they didn't, because that would have defeated their goal (the production of ideological propaganda useful for political talking points like the one you just raised).

If you're going to try and claim your political opponents are generically less honest than you are in politics, that's one thing. But when people who claim to be scientists do it, and they use the exact same tactics, that just degrades science. It's not actual research of the sort that arises from some coherent theory of the world and which can be neutrally tested. It's simply "how can we prove that Republicans are evil today?".

car_analogy · 4 years ago
> Thus, the observation that Republicans were more likely to be suspended than Democrats provides no support for the claim that Twitter showed political bias in its suspension practices. Instead, the observed asymmetry could be explained entirely by the tendency of Republicans to share more misinformation.

Interesting - a difference in outcomes that is not caused by bias, but by different behavior. Could this explain other apparent "biases"? Such as hate speech suspensions [1]? Or police shootings [2]?

[1] https://www.washingtonpost.com/technology/2021/11/21/faceboo...

[2] https://www.clarionledger.com/story/opinion/columnists/2016/...

multjoy · 4 years ago
>Interesting - a difference in outcomes that is not caused by bias, but by different behavior.

How is that interesting? You only have to look at the type of content shared by right vs left to realise that the 'bias' against the right is the result of shitty content.

It's the same argument used by right wing comedians, who fail to realise that they're just not that funny.

incomingpain · 4 years ago
I wanted to test this myself after Elon bought in and had a poll of free speech.

I created an account and connected to Canadian conservatives and chess, two of my interests. I never commented or retweeted anything; I would just like stuff. Then one day I saw a comment by MVL, who had kind of wrongly been excluded from a picture. I felt bad and told him he's awesome. I was censored: I see my comment normally, but from a private window it's labelled as offensive content.

This, however, makes some sense. I think all medicine should be a free choice: the government should never be able to coerce you into taking medication or prevent you from getting it. That stance, however, connects me to a political party that is against vaccine mandates, which Twitter treats as misinformation.

I can see what happened here but boy do I ever not agree with that. Elon fixing this will be a good thing.

jdrc · 4 years ago
Study: studies of misinformation are biased against conservatives.

Not that I care either way, but both sides can lie with statistics.