A lot of what's currently being said about "disinformation" is completely incoherent and makes little sense if you just track it over time. As late as 2014, social media was presented as a force for good. ("Wisdom of the crowds", etc.) Just a couple of years later it was rebranded as some kind of machine where evil people brainwash idiots into becoming evil people. Who are the idiots? Well, given that YouTube and Facebook have billions of users, it seems most humans on the planet are shoved into that category now. Everything they see and say needs to be carefully curated by professionals, lest the crowds go mad.
Meanwhile, what are the incentives to, say, post an intelligent and comprehensive YouTube comment? There were none in 2014 and there are none right now. The UI, the up-voting process, the very (lack of) structure in how comments are displayed and sorted all encourage verbal vomit, and that's exactly what YouTube gets. In general, social media structure usually provides zero incentives to engage in real discussions and deep thinking. I don't see this being addressed, let alone changed, no matter what the executives say.
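A concrete illustration of the sorting point: the difference between ranking comments by raw upvotes and by a confidence-adjusted score. This is a toy sketch with made-up vote counts, using the well-known Wilson lower-bound trick, not anything YouTube actually does:

```python
import math

def wilson_lower_bound(upvotes, downvotes, z=1.96):
    """Lower bound of the Wilson score interval for the upvote fraction
    (95% confidence by default). Rewards consistent approval over raw totals."""
    n = upvotes + downvotes
    if n == 0:
        return 0.0
    p = upvotes / n
    return (p + z*z/(2*n) - z * math.sqrt((p*(1-p) + z*z/(4*n)) / n)) / (1 + z*z/n)

# Hypothetical comments: (text, upvotes, downvotes)
comments = [
    ("one-liner that got an early pile-on", 500, 400),
    ("thoughtful reply with fewer votes", 40, 2),
]

# Naive sort: raw upvotes win, so the contested one-liner floats to the top.
by_upvotes = sorted(comments, key=lambda c: c[1], reverse=True)

# Confidence-adjusted sort: the consistently approved comment ranks first.
by_wilson = sorted(comments, key=lambda c: wilson_lower_bound(c[1], c[2]), reverse=True)
```

Which of the two orderings a platform ships is a design choice with direct consequences for the kind of comment that gets visibility.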
I recommend that everyone read Marshall McLuhan's Understanding Media. Despite being written in 1964, it presents a much more useful analytical framework for really understanding social media than anything I've seen published in the last few years.
I've seen some informal analysis that claims that the ubiquity of smartphones and the successful transition of the massive social networks to mobile is what specifically led to many of the alleged negative societal effects (increased political polarization, for example, although there are many other examples). The thinking goes that it's the all-day thumb-scrolling addiction loop on a tiny device that leads to these negative outcomes even more so than what was supposedly already happening with the pre-smartphone rise of social media.
Your mention of 2014 might be compatible with this line of thinking. Facebook famously abandoned its mobile HTML5 stack and "went all in" on mobile in 2012. They also acquired Instagram in late 2011, and WhatsApp in 2013.
> I've seen some informal analysis that claims that the ubiquity of smartphones and the successful transition of the massive social networks to mobile is what specifically led to many of the alleged negative societal effects (increased political polarization, for example, although there are many other examples).
Of course you've seen such analysis, it's fairly ubiquitous.
But as the parent and the article argue, it's completely incoherent. Not because phones haven't given polarization and propaganda a bit of a push, but because these things are just parts of long-term trends that need to be looked at. And a lot of the "social media opens people to evil" complaints came loudly from mainstream media, who were understandably upset at losing their semi-monopoly on trend-setting (i.e., propaganda).
Of course, a lot of the forces that effectively hacked social media were the extreme right, of which I'm hardly a fan. But this wasn't "a sudden rise in propaganda" but a relative democratization of propaganda. If you want a non-propagandistic way of disseminating information, you need to go much further back than 1990, and probably look at a whole different method of communicating.
One other way to think about it is that a massive content fork started happening around the time of the internet.
In the "beginning", as the article states, there was the big 3 networks, and a bunch of fringe radio programs, and movie theatres, and newspapers. Broadly highly consolidated. A bit more emphasis on local things.
Then there was cable. You were certainly going to get more perspectives, but your age group is probably going to still only watch a few channels. And still radio and newspapers, and movies.
Now there's YouTube, Twitch, cable, Crunchyroll, radio, podcasts, social media. 10,000 channels of content. You as an individual mix and match based on your own preferences and interests. It's also available everywhere. As is conversing with friends.
It seems like pure logic that it is much harder to create a consensus in a winner-take-all political system.
> The UI, the up-voting process, the very (lack of) structure in how comments are displayed and sorted all encourage verbal vomit, and that's exactly what YouTube gets. In general, social media structure usually provides zero incentives to engage in real discussions and deep thinking. I don't see this being addressed, let alone changed, no matter what the executives say.
I think you might be literally the first person I've ever encountered who has touched on this, whereas, whenever I mention the possibility, any response I get is disagreement or negativity. Very often, people "know" that it is not the problem, yet oddly have a very strong aversion to sharing how it is they know that.
I think it would be kind of hilarious if this was in fact (but unknown) one of the root causes of our problems, but no one has the ability to even consider the idea.
I thought it was well known that the algorithms optimise for stress and anger, because it drives more engagement. I think it might even be involuntary - the algorithms may have been set to train for enhanced engagement, and hate and anger happen to be the solution.
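A toy illustration of that dynamic, with invented numbers: nobody writes "promote anger" anywhere; the ranker only maximizes a proxy metric, and anger-inducing content happens to score highest on it.

```python
# Hypothetical posts with predicted interaction probabilities (made-up values).
posts = [
    {"title": "careful policy explainer", "p_click": 0.02, "p_comment": 0.01, "p_share": 0.005},
    {"title": "outrage bait",             "p_click": 0.15, "p_comment": 0.12, "p_share": 0.08},
]

def engagement_score(post):
    # The proxy objective: any interaction counts as success,
    # with no term for accuracy, civility, or reader well-being.
    return post["p_click"] + post["p_comment"] + post["p_share"]

# Ranking by the proxy alone hands the outrage bait the top slot.
feed = sorted(posts, key=engagement_score, reverse=True)
```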
> I think you might be literally the first person I've ever encountered who has touched on this, whereas, whenever I mention the possibility, any response I get is disagreement or negative.
So they give you a downvote, or a quippy, snarky tweet in response, to disagree with you that there is a lack of thoughtful, deep thinking about topics.
I'm pretty sure this is the joke in multiple John Oliver web segments [0]. It's pretty well-understood that the YouTube design is not conducive to long-form discussion.
> As late as in 2014 social media was presented as a force for good. ("Wisdom of the crowds", etc.) Just couple years later it was rebranded as some kind of machine where evil people brainwash idiots into becoming evil people. Who are the idiots? Well, given that YouTube and Facebook have billions of users, it seems most humans on the planet are shoved into that category now
I don't think that's true, because from what I remember, fewer than 1% of viewers leave comments.
So the "idiots" I guess would be small vocal minorities. The question is how far their influence goes?
But it's very possible that a ton of people don't engage in the comments and discussions on social media, because they don't find the design of it suitable for intelligent discourse, and maybe that leaves you with only idiotic comments.
> In general, social media structure usually provides zero incentives to engage in real discussions and deep thinking. I don't see this being addressed, let alone changed, no matter what the executives say.
That makes me think of the Russian math culture. Imagine if there was some way to create incentives via some Social Media NG to encourage that kind of thoughtful discourse.
What if what drives YouTube profits isn't necessarily substance, but engagement (to a first-order approximation)?
Perhaps the comments are already optimized, each as a sort of micro-clickbait? There seem to be a lot of users who are highly motivated to participate in the "U! No U!! NO U!!!!" back and forth.
Since 2014, the users of social networks have become more aware of the wholesale snooping and privacy violations that are hidden behind the "terms and conditions" of their social networks. I wonder if the whole disinformation narrative isn't in part an attempt to deflect public attention from the issue of privacy violations, an attempt to present the social network services as protectors of the realm, and to find some indirect justification and relativization for the business practices of these companies.
I mean, you might get a sense of who is pushing the argument if you think about who stands to gain from it. The article lists Alex Stamos, formerly Facebook’s chief security officer, as a member of the Aspen Institute's 'Commission on Information Disorder', among others.
The article also says that 'it's all theater', but draws a very different conclusion: the 'disinformation' argument is supposed to support the claim that targeted advertising is actually working: "
Ironically, to the extent that this work creates undue alarm about disinformation, it supports Facebook’s sales pitch. What could be more appealing to an advertiser, after all, than a machine that can persuade anyone of anything? This understanding benefits Facebook, which spreads more bad information, which creates more alarm. Legacy outlets with usefully prestigious brands are taken on board as trusted partners, to determine when the levels of contamination in the information ecosystem (from which they have magically detached themselves) get too high. For the old media institutions, it’s a bid for relevance, a form of self-preservation. For the tech platforms, it’s a superficial strategy to avoid deeper questions. A trusted disinformation field is, in this sense, a very useful thing for Mark Zuckerberg."
Also, the article mentions that for the political class, 'disinformation' is a way to explain away their own failings: "the Establishment needs the theater of social-media persuasion to build a political world that still makes sense, to explain Brexit and Trump...A common account of social media’s persuasive effects provides a convenient explanation for how so many people thought so wrongly at more or less the same time. More than that, it creates a world of persuasion that is legible and useful to capital—to advertisers, political consultants, media companies, and of course, to the tech platforms themselves. It is a model of cause and effect in which the information circulated by a few corporations has the total power to justify the beliefs and behaviors of the demos. In a way, this world is a kind of comfort."
Different people are drawing different conclusions from the same data, fascinating...
I’m saddened by the harsh reaction so far, which seems based mostly on the first couple of paragraphs of a thoughtful and well-written essay. The discussion of online persuasion being used by both advertisers and spreaders of disinformation, and the many incentives that exist for different parties to accept that “social media” is an all-powerful manipulation machine, is particularly interesting.
I like this in part because the conclusion is strangely optimistic - we have more power than we think, if only we can recognize it.
The first out of the gate on new submissions are the knee-jerk crazies who didn't read the article, and who want to push their agenda instead of thinking about and discussing things.
Comment quality improves over time. As your own comment reassuringly demonstrates.
> the many incentives that exist for different parties
Someone should update the Five Filters model from Manufacturing Consent to account for social media. Compare the social network media ecosystem to the prior broadcast and print media.
Here's my stab at it:
The Five Filters are Owners, Advertisers, Sources, Flak, and War.
Owners (social networks) took most of Advertisers' economic power for themselves, flipping that power relationship. With loss of status, Advertisers' power to control dialog and shape opinion largely disappeared.
To reduce costs, prior Owners debased Sources by replacing news with infotainment and drama/comedy with reality TV. The current Owners reduced costs even further by making the audience their own Sources. Trolling, conspiracy, karma, gossip, outrage IS the new content. Genius.
While social media Owners became the biggest economic winners, Flak became the cultural winners, displacing Advertisers. Social media eliminates provenance (authenticity) by laundering (disintermediation) content. Flak now enjoys impunity that Advertisers could only dream of.
For lack of a well defined enemy, War turned us against each other.
--
Two aspects of the rise of social media confuse me.
Why haven't Advertisers revolted? Owners and Flak continue to steal their lunch money, and they just take it.
The battle lines for free speech haven't been updated for social media. New lines had to be drawn with the advent of broadcast media. (Duh.) No one anticipated the function and impact of algorithmic recommenders. Total game changer. So we should recognize and accept the new reality and update Section 230 accordingly.
Of course, Flak benefits most from this willful blindspot, and is best able to shape the dialog, to better defend their spoils.
Owners will oppose any change by default, because why not? The status quo is pretty terrific.
>I’m saddened by the harsh reaction so far, which seems based mostly on the first couple of paragraphs of a thoughtful and well-written essay. The discussion of online persuasion being used by both advertisers and spreaders of disinformation, and the many incentives that exist for different parties to accept that “social media” is an all-powerful manipulation machine, is particularly interesting.
I have read more of it than that, and I found it lacking in original thinking, and poorly written. As for the effectiveness of persuasion, this has been studied quantitatively for quite some time now, and it can't be hand-waved away.
Persuasion on social media is certainly effective. But I think it’s also worth reflecting on the number of people who would still believe false news stories/conspiracy theories/etc in a hypothetical world without social media, and what techniques we might use in that world to combat the issue.
As for original thinking, I may not be as well-read on these topics as others. Some of the ideas seemed obvious once I read them, but I didn’t think I’d seen them before. Example: “disinformation” becoming a catch-all word for “things I disagree with”.
It would help if the author refrained from poor framing and from abusing the trope of the blithe, self-centered American, presented as if Americans were the only ones guilty of such infractions.
Let’s have a 360 review: I’ll taint it by first talking about a bunch of bad stuff everyone has done, but I’ll pin it on you for this review. After I’ve said a bunch of bad stuff, I’ll redeem you a little by offering some hope. How does that sound?
This article could have been a third as long and contained far more data. Overall, I agree that online ads and disinformation have been given a god-like aura that is not deserved.
As a percentage of views on platforms like Facebook, the amount of outright misinformation is minuscule: tens of millions vs. hundreds of billions. Now, if you define misinformation as “information I disagree with/don't like”, then sure, that number goes up.
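Taking the commenter's rough orders of magnitude at face value (50 million and 300 billion below are stand-ins for "10s of millions" and "100s of billions", not audited figures), the implied share is tiny:

```python
misinfo_views = 50_000_000       # "10s of millions" (rough stand-in)
total_views = 300_000_000_000    # "100s of billions" (rough stand-in)

share = misinfo_views / total_views
print(f"{share:.4%}")  # roughly 0.0167% of all views
```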
There was no “black magic” at Cambridge Analytica, those guys were idiots with data they didn’t understand selling promises they couldn’t deliver.
FB and Google are not “grimly secretive” compared to other F500s, or, say, Apple. They are pretty transparent, and their employees are still notoriously loose-lipped. Also, there has always been “yellow journalism”.
I agree that the fight against disinformation is likely to be worse than the disinformation itself.
The mainstream media blasted the entire populace with a cold war, KGB thriller, pee tape conspiracy theory for 4 years straight. It wasn't just cable news, it was NYT/WSJ/WashPo piling onto the misinformation.
In the end, it turned out to be much hoopla about nothing: a giant psyop by the establishment powers that now want to earn your trust back.
I think that actually points towards the issue of social media. Even without explicit collaboration from US actors, Russian actors are able to interfere in elections and other processes through reaching a large US populace on social media.
Prior media would have been a lot harder for foreign agents to exploit. New social media are far easier for a foreign power to take advantage of.
For example, I'm able to participate in this debate without even being American. I have reach and influence over Americans that I never had before. How many people commenting here aren't American? Half the comments could be from Russians, for all we know.
That Trump didn't ask for their help doesn't mean that he didn't play into their hand.
In a strange way though, you can't dissociate them anymore. He could very well have won all by himself, hacked email or not, fake accounts or not, etc. But they did hack and leak her emails, and they did create fake accounts, and had troll farms targeted at American voters.
I can see the hesitancy to acknowledge that, because it could discredit your own win, but it's also a very interesting new scenario that didn't exist before: foreign powers never before had direct means of communication with such a large portion of your citizenry as they do now.
If I were to speak for the past, it was actually the US media that were among the few able to reach into other countries' populaces through movies, books, games, and all that. But it was very hard for non-US media to reach Americans, and that's no longer the case.
>Half the comments could be from Russians all we know.
If they're any good at what they do… :)
This HN thread is basically ground zero for pushing the 'Russiagate' set of arguments: namely, everything from Trump to QAnon and the antivaxxers are all totally organic and Russia didn't do any of it, in spite of their known tactics for seeding disparate opinion groups with chaos and disinformation to produce a state known as 'the Zone' where nobody knows what, if anything, is true or who is real anymore.
I've got to hand it to them, it is one hell of a tactic. It's somewhat less effective if you know it's one of their tactics. Rather than producing despair, it's downgraded to more 'fog of war' tactics. If you don't know about the tactic, despair is a more likely reaction.
1. Russian population is smaller than US (much smaller than population of all NATO countries).
2. Most people in Russia do not speak English or do it poorly. Russian accent is very hard to hide even in writing. E.g. we don't have articles in our language, the whole concept of article is so foreign to us, that I, for example, just put them randomly, because even after years I can't understand why and where to use them. So it's easy to spot Russian comments.
3. There wasn't a single proof that Russia meddled in US elections, not even indirect evidence. It's mostly a bombardment of US media: "Russians did it" (and 100500 pages of gibberish without a single piece of evidence). Though there were no proofs, most people read only headlines and start to assume there was actual evidence. There wasn't any.
Also, I don't understand why would Russia support Trump. Any US politic hates Russia the same, be it democrat or republican. US is a fascist state waging fascist wars over resources (Vietnam, Iraq, Libya, Syria) that hates almost all the world because it considers itself a chosen nation and all the rest are savage nations. US inherently hates Russia. Always hated - read newspapers from 19th and early 20th century - the hatred was already there. Always will hate (just read comments under any news on Reddit /r/worldnews - many commenters say that Russia should be nuked, that Russians should be killed, etc).
Why would we care to elect Trump? Did he do anything good for Russia? No.
Actually, he appears to be a sane person unlike, e.g. Biden, so in our interest would be to elect a leader with dementia instead.
True, but I think you might be getting downvoted because that fails to account for the time leading up to the election (and Brexit, I guess), which is what a lot of the people commenting on this subject are interested in.
The article's author repeatedly presumes that disinformation is only coming from the right. This discredits the article, given that the author is apparently unaware of all the disinformation coming from the left.
I can't see how you read this article that way. For example, how do you read this section?
> A quick scan of the institutions that publish most frequently and influentially about disinformation: Harvard University, the New York Times, Stanford University, MIT, NBC, the Atlantic Council, the Council on Foreign Relations, etc. That the most prestigious liberal institutions of the pre-digital age are the most invested in fighting disinformation reveals a lot about what they stand to lose, or hope to regain.
[...] However well-intentioned these professionals are, they don’t have special access to the fabric of reality.
To me, the author is being clear: the "big disinfo" institutions which are claiming that disinformation is a huge problem are overwhelmingly on the left, and they are scrounging for scraps of evidence. The author seems deeply skeptical in an interesting way.
Speaking of other disinformation sources means you're part of the Vast Right Wing Conspiracy, tho. Probably directly employed by the Koch foundation under the direction of Putin.
Disinformation aside, let's just consider for a second that people may be simply consuming information that they want to hear, rather than taking in information that the media thinks or wants them to hear.
Isn't it all the same really - people want to hear narratives that confirm their bias. Internet/modern communications makes it easy for media and politicians to both pick up on that and reach out to the right group of people with the targeted misinformation.
People are too much in love with their beliefs - critical thinking problem.
The problem I see is one "alternate reality" starts to censor another "alternate reality".
It's just a new form of warfare. The war for control of your mind and the shaping of beliefs about reality.
A world where truth isn't part of the information or power structure means that he who controls the narrative controls reality. Arguably more terrifying than some false opinions and beliefs in the world.
I'm not so sure it's always specifically that; maybe initially.
Perhaps someone has a strong opinion about a thing. They -know- they are right, and every news source is wrong. Until they find one that agrees. So far, we agree: that's what they want to hear.
But they keep turning to this source, who is spreading wild theories and untruths in other facets. But they were so right about issue X, they must be right about this too!
I think this pretty much sums up why so many 'conservatives' ended up unvaccinated. I cannot imagine that they set out wanting to hear that vaccines are unsafe or will give you microchips. Something/someone they trusted led them down this awful path.
As much as I support free speech, the last few years have really shown me the dangers of it, as well. But how can you regulate? Any arbiter is going to have a bias.
I think it’s likely that ‘conservatives’ (and New Age enthusiasts, and others) wanted to hear anti-elitist messages, messages that discredited experts, which anti-vax messaging is congruent with.
Ironically, "Bad News" is exactly the type of clickbaity title that is endemic in the bad news that the article talks about, e.g. "Bad News! You won't believe what happened next!"
The Instagram deal was in 2012 and WhatsApp was in 2014. I don't remember when they abandoned HTML5 though, might have been 2012.
[0]: https://www.youtube.com/watch?v=knbw0gJHHBk
--
Manufacturing Consent's Five Filters https://en.wikipedia.org/wiki/Manufacturing_Consent#Propagan...
The Owners have successfully convinced their subjects that this is a bad idea.
That said, we must tread lightly with the precarious Section 230 and any new modification.
Edit: Owners are downvoting you fast.
I have read more of it than that, and I found it to be lacking in original thinking, and poorly written. As for the effectiveness of persuasion, this has been studied quantitatively for quite some time now, and it can't be hand-waved away.
As for original thinking, I may not be as well-read on these topics as others. Some of the ideas seemed obvious once I read them, but I didn’t think I’d seen them before. Example: “disinformation” becoming a catch-all word for “things I disagree with”.
Let’s have a 360 review: I’ll taint it by first talking about a bunch of bad stuff everyone has done, but I’ll pin it on you for this review. After I’ve said a bunch of bad stuff, I’ll redeem you a little by offering some hope. How does that sound?
As a percentage of views on platforms like Facebook, the amount of outright misinformation is minuscule: tens of millions vs. hundreds of billions. Now if you define misinformation as “information I disagree with/don’t like”, then sure, that number goes up.
There was no “black magic” at Cambridge Analytica; those guys were idiots with data they didn’t understand, selling promises they couldn’t deliver.
FB and Google are not “grimly secretive” compared to other F500s, or, say, Apple. They are pretty transparent, and their employees are still notoriously loose-lipped. Also, there has always been “yellow journalism”.
I agree that the fight against disinformation is likely to be worse than the disinformation itself.
In what way?
Misinformation is subjective. There is nobody we can trust to be the arbiter of what is misinformation and what isn’t.
Also the first attempts have failed terribly.
In the end it turns out there was much hoopla about nothing, a giant psyop by the establishment powers that now want to earn your trust back.
Prior media would have been a lot harder for foreign agents to co-opt. New social media are way easier for a foreign power to take advantage of.
For example, I'm able to participate in this debate without even being American. I have new reach and influence over Americans that I never had before. How many people commenting here aren't American? Half the comments could be from Russians, for all we know.
That Trump didn't ask for their help doesn't mean that he didn't play into their hand.
In a strange way though, you can't dissociate them anymore. He could very well have won all by himself, hacked email or not, fake accounts or not, etc. But they did hack and leak her emails, and they did create fake accounts, and had troll farms targeted at American voters.
I can see the hesitancy to acknowledge that, because it could discredit your own win, but it's also a very interesting new scenario that didn't exist before: foreign powers never had direct means of communication with such a large portion of your citizenry as they do now.
If I were to speak for the past, it was actually the US media that were among the few able to reach into other countries' populaces through movies, books, games, and all that. But it was very hard for non-US media to reach Americans, and that's no longer the case.
If they're any good at what they do… :)
This HN thread is basically ground zero for pushing the 'Russiagate' set of arguments: namely, everything from Trump to QAnon and the antivaxxers are all totally organic and Russia didn't do any of it, in spite of their known tactics for seeding disparate opinion groups with chaos and disinformation to produce a state known as 'the Zone' where nobody knows what, if anything, is true or who is real anymore.
I've got to hand it to them, it is one hell of a tactic. It's somewhat less effective if you know it's one of their tactics. Rather than producing despair, it's downgraded to more 'fog of war' tactics. If you don't know about the tactic, despair is a more likely reaction.
2. Most people in Russia do not speak English or speak it poorly. A Russian accent is very hard to hide, even in writing. E.g. we don't have articles in our language; the whole concept of articles is so foreign to us that I, for example, just place them randomly, because even after years I can't understand why and where to use them. So it's easy to spot Russian comments.
3. There wasn't a single proof that Russia meddled in US elections, not even indirect evidence. It's mostly a bombardment by US media: "Russians did it" (and endless pages of gibberish without a single piece of evidence). Though there was no proof, most people read only headlines and started to assume there was actual evidence. There wasn't any.
Also, I don't understand why Russia would support Trump. Any US politician hates Russia the same, be they Democrat or Republican. The US is a fascist state waging fascist wars over resources (Vietnam, Iraq, Libya, Syria) that hates almost all the world because it considers itself a chosen nation and all the rest savage nations. The US inherently hates Russia. It always has (read newspapers from the 19th and early 20th century; the hatred was already there) and always will (just read the comments under any news item on Reddit /r/worldnews; many commenters say that Russia should be nuked, that Russians should be killed, etc.).
Why would we care to elect Trump? Did he do anything good for Russia? No.
Actually, he appears to be a sane person, unlike, e.g., Biden, so it would be in our interest to elect a leader with dementia instead.
> A quick scan of the institutions that publish most frequently and influentially about disinformation: Harvard University, the New York Times, Stanford University, MIT, NBC, the Atlantic Council, the Council on Foreign Relations, etc. That the most prestigious liberal institutions of the pre-digital age are the most invested in fighting disinformation reveals a lot about what they stand to lose, or hope to regain. [...] However well-intentioned these professionals are, they don’t have special access to the fabric of reality.
To me, the author is being clear: the "big disinfo" institutions which are claiming that disinformation is a huge problem are overwhelmingly on the left, and they are scrounging for scraps of evidence. The author seems deeply skeptical in an interesting way.
But partisans have happily decided to associate their worldview with the concept of reality.
People are too much in love with their beliefs - critical thinking problem.
The problem I see is one "alternate reality" starts to censor another "alternate reality".
It's just a new form of warfare. The war for control of your mind and the shaping of beliefs about reality.
A world where truth isn't part of the information or power structure means that he who controls the narrative controls reality. Arguably more terrifying than some false opinions and beliefs in the world.
See also the power of the default option.
Perhaps someone has a strong opinion about a thing. They -know- they are right, and every news source is wrong. Until they find one that's also "right". So far, we agree: that's what they want to hear.
But they keep turning to this source, who is spreading wild theories and untruths in other facets. But they were so right about issue X, they must be right about this too!
I think this pretty much sums up why so many 'conservatives' ended up unvaccinated. I cannot imagine that they set out wanting to hear that vaccines are unsafe or will give you microchips. Something/someone they trusted led them down this awful path.
As much as I support free speech, the last few years have really shown me the dangers of it, as well. But how can you regulate? Any arbiter is going to have a bias.
"Bad News: Selling the story of disinformation".