> The current situation in the US is challenging. Misinformation, disinformation, and conspiracy theories are rampant in every cohort of society.
Most of the higher-quality longitudinal evidence I've seen does not show that there is an increase in believing in conspiracy theories or misinformation (or, as we used to call it, people simply being incorrect). It's simply much more visible now due to the Internet being widely adopted, whereas in the past you'd never know what people across the world (or even your own city) may have been thinking about.
Since there is no stable or global solution to the problem of deciding who gets to label/regulate information, nor any evidence showing that even with the regulation of information people actually change their beliefs to the 'correct' ones, I don't personally see merit in this path. The temptation to find a solution is sensible prima facie, but history and human psychology show us that this is a fool's errand.
> Most of the higher-quality longitudinal evidence I've seen does not show that there is an increase in believing in conspiracy theories or misinformation (or, as we used to call it, people simply being incorrect).
Do you have a source for that? Not trying to dispute your claim, just curious.
"There are no major comprehensive, longitudinal studies on Americans’ attitudes toward conspiracy theories, mostly because it was not rigorously measured until about 10 to 20 years ago."
"...reviewed over 120 years of letters to the editor, from 1890 to 2010, for both The New York Times and the Chicago Tribune. In over 100,000 letters, this review showed absolutely no change in the amount of conspiracy theory belief over time. In fact, the percent of letters about conspiracy theories actually declined from the late 1800s to the 1960s and has remained steady since then."
They cite: American Conspiracy Theories - Joseph Uscinski & Joseph Parent. Quotes from here:
https://theconversation.com/are-conspiracy-theories-on-the-r...
It seems to me that higher visibility is all but guaranteed to lead to an increase in belief. Simple exposure is one thing: most people aren't going to concoct new conspiracy theories on their own, but they may decide to go in for them when they see them.
Social persuasion/cohesion is another: we're all influenced by the views of the people around us (like it or not). When you see half a dozen people in one of your peer groups talking about $WEIRD_THING, you're naturally inclined to give it some time, and maybe some credence.
I don't disagree about the deep problems with regulating it, though.
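The peer-influence dynamic described above is essentially what threshold models of collective behavior (Granovetter-style) capture. A toy sketch, with all thresholds and group sizes invented purely for illustration:

```python
# Toy threshold model: each person adopts a belief once the number of
# adopters they can see meets their personal threshold. Everyone shares
# one peer group; the numbers below are invented for illustration.

def run_cascade(thresholds, seed_adopters):
    """Iterate until no one else adopts; returns the final set of adopters."""
    adopted = set(seed_adopters)
    changed = True
    while changed:
        changed = False
        for person, threshold in enumerate(thresholds):
            if person not in adopted and len(adopted) >= threshold:
                adopted.add(person)
                changed = True
    return adopted

# A chain of slightly-more-skeptical people: each new adopter is exactly
# enough social proof to tip the next, so one originator converts everyone.
print(len(run_cascade([0, 1, 2, 3, 4, 5, 6, 7, 8, 9], {0})))  # -> 10

# Raise everyone's threshold by one and the cascade never starts.
print(len(run_cascade([0, 2, 3, 4, 5, 6, 7, 8, 9, 10], {0})))  # -> 1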
You're going to have to back that up with some evidence, especially considering the specious claims you've been making over the last few weeks during the Capitol terror attack, etc.
I have things like this showing up in my mailbox unsolicited. People believe what it says because it's coming from a group they're familiar with. I'd say there should probably be real legal action taken against things like this...

http://prntscr.com/wweheo
http://prntscr.com/wwexix
http://prntscr.com/wwf2eg
Well, I'd say that the crowds shamelessly marching in the streets behind bullshit conspiracies, as well as the support for their figurehead who was also spreading shameless bullshit, are quite visible evidence outside the internet.
Some of the things people currently believe are so patently absurd and obviously made up that labeling them as false would not make much difference. It would just be one more conspiracy vector.
Even if it did, how could the regulators possibly keep up with the Gish gallop of misinformation? There are those who seriously examine and rebut the claims of QAnon et al., but can you imagine a more thankless task?
A better approach is to teach good informational hygiene to kids: pull from a variety of information sources, give more weight to the ones that correct their errors, build awareness and a taxonomy of cognitive biases, stuff like that.
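A minimal sketch of the "weight the ones that correct errors" idea; the scoring rule and factors here are my own invention, not drawn from any actual curriculum:

```python
# Each source starts with equal trust. A source that transparently corrects
# its own error gains weight; one that leaves an error standing loses it.
# The multiplicative factors are arbitrary, chosen only to illustrate.

FACTORS = {
    "accurate": 1.05,             # story held up over time
    "corrected_own_error": 1.10,  # got it wrong, but fixed it openly
    "uncorrected_error": 0.70,    # got it wrong and buried it
}

def update_weight(weight: float, outcome: str) -> float:
    return weight * FACTORS[outcome]

transparent, opaque = 1.0, 1.0
for outcome in ["accurate", "uncorrected_error", "corrected_own_error"]:
    transparent = update_weight(transparent, outcome)
for outcome in ["accurate", "uncorrected_error", "uncorrected_error"]:
    opaque = update_weight(opaque, outcome)

# The self-correcting source ends up more trusted than the one that isn't.
print(transparent > opaque)  # -> True
```

The design choice worth teaching is the asymmetry: a visible correction is evidence of a working error-checking process, so it should raise trust rather than lower it.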
I think in any community that is gratified in some way by a belief, there's a spectrum from actually believing at one end to enjoying the camaraderie and the belief as a conscious fantasy at the other, and I think people who inhabit the "conscious fantasy" end can slide towards the "actually believing" end via suspension of disbelief, like people do while enjoying a movie or a role-playing game.
When the community is something like a sci-fi fandom community, people on the actually believing end of the spectrum would be considered mentally ill, while people on the camaraderie-and-fantasy end are considered well-adjusted. In a religious community, it's more complicated, and people on either end of the spectrum might be accepted or stigmatized depending on the religion itself and the point of view of the beholder.
Either way, if you think of fandom communities and religious communities as analogues, you wouldn't expect these communities to go away until people can no longer get the gratification they want from them. I don't know how that can be accomplished.

https://en.wikipedia.org/wiki/Three_men_make_a_tiger
https://zh.m.wiktionary.org/zh-hans/%E4%B8%89%E4%BA%BA%E8%A1...
From 'learning from any 3 random wanderers' to 'fear the tiger of 3': was that actual wisdom, or a cynical conclusion about human behaviour?
Informational hygiene is already part of the high school curriculum in every state I'm familiar with. Maybe we could make it better, but I struggle to think of any specific reforms that'd help.
The current information hygiene and media literacy curriculum, as I understand it, leans heavily on using 'reputable sources'. This is well and good until those sources start to lose their reputability: by publishing stories that stretch credulity (WaPo and the NYT with their 'anonymous intelligence sources familiar with the thinking of people near the matter'), by becoming increasingly partisan, or by simply appearing too selective in their coverage with the benefit of hindsight. Noticing these things invalidates the majority of the curriculum in a student's mind, and leaves them unequipped to deal with the beasts wandering the information wilderness.
If I were to reform this, I'd place greater emphasis on information composition, and source/author auditing. Acknowledge that mainstream publications often miss things that less reputable outlets will cover (albeit poorly), and teach students to think in degrees of certainty.
It's less than a single semester of content, and you don't really practice it, since you're pretty much guaranteed to be spoonfed authoritative information anyway.

I recall being taught that the civil war was about "states' rights" -- even the teacher rolled her eyes at that one.
I have a pet theory that Informational Hygiene is going to be one of the most important classes in the upcoming years, taking a similar place to sex ed.
Are there any resources on classes in the space? I did a quick Google and didn't see anything that stood out.
As a counter-example, to me being right or approaching truth consistently and in a repeatable manner is much more fun than believing whatever I want to believe.
I know which beliefs I'd like to be true, but self-deception isn't as fun as being right; right as in claims made within an independently validated methodological framework that makes my claims reproducible.
That isn't inherent, I assume, but learned.
> Teaching will not solve the problem with incentives -
Citation desperately needed!
The arguably oldest human system of hierarchical organization, the Church, would like to have a word with you: a system based purely on psychological and social incentives, nothing else, giving relevance to itself and propagating certain world-views, beliefs, and thought-systems.
From Matt Taibbi's https://taibbi.substack.com/p/we-need-a-new-media-system:

"Media firms work backward. They first ask, 'How does our target demographic want to understand what's just unfolded?' Then they pick both the words and the facts they want to emphasize."
> Most progressive thinking is moving towards more legalization and treatment, vs banning and punishment... Treat it like we’re starting to treat drugs
I'm all for the direction that drug policy is going (towards more treatment, less prison time - I live in Oregon and we're on the forefront of this in the US) but I'm not sure how this would work in regards to misinformation. How do we get conspiracy theorists into treatment? Won't they just figure that's part of the conspiracy to re-educate them? And a lot of this disinformation is coming from bad actors with intent to harm.
The epistemic gap is, I think, the most serious problem we face now. We can't address other serious problems like climate change, or even getting COVID vaccines out to people, because there's disinformation. We seem to be completely unprepared to deal with this. The immune system offers an analogy: we need a societal immune system that attacks disinformation in some way. No idea how we get there, though.

I mean, it would literally be re-education.
Re-education is a tainted word, sadly. If someone is indoctrinated and fed only lies and half-truths, then the only real solution is to be educated on the truth. Learning from the work of others is easiest, but doing one's own experiments is certainly possible.
> even getting COVID vaccines out to people because there's disinformation.
During the last year, we have seen the following claims from public health authorities affiliated with well-regarded scientific institutions:
- Vitamin D does nothing
- Vitamin D helps
- You should wear masks
- Masks are not actually that helpful
- Trump's attempts to stop travel from China are racist and terrible (Schumer, Pelosi)
- We need to shut down most transportation and not just airports
- We need to lockdown all non-essential businesses
- Hollywood is an essential business, but churches and restaurants aren't
- Lethality of COVID-19 is 2-5%
- Lethality of COVID-19 is ~0.2-0.4%
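For what it's worth, the last two figures in the list aren't necessarily contradictory: they likely reflect the case fatality rate (deaths over confirmed cases) versus the infection fatality rate (deaths over all infections, most of which never get tested). A toy calculation with invented numbers shows how both can circulate as "true" at once:

```python
# Invented illustrative numbers -- not real epidemiological data.
deaths = 20_000
confirmed_cases = 500_000
true_infections = 5_000_000  # assume 10x of infections never get tested

cfr = deaths / confirmed_cases  # case fatality rate
ifr = deaths / true_infections  # infection fatality rate

print(f"CFR: {cfr:.1%}")  # -> CFR: 4.0%  (the "2-5%" style figure)
print(f"IFR: {ifr:.1%}")  # -> IFR: 0.4%  (the "~0.2-0.4%" style figure)
```

The gap between the two is driven almost entirely by the assumed undercount of infections, which is exactly the quantity that was unknown early in the pandemic.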
And just in the last couple days (highly coincidentally, with no bearing at all on politics), there has been a flood of reports that Chicago, NY, and San Francisco are using to justify fully reopening their economies. Those scientific reports say that the lockdowns are at best useless.
At what point have the "experts" been right this year? Which of those things should we have labelled misinformation? Attempting to install a technocratic version of truth has never worked, and we should have the grace to allow "wrong" things to be said. The experts, so-called, have been inconsistent and wrong through this whole debacle.
Institutions and experts have always been inconsistent. Just look at nutrition science: we've gone from "fats are bad", to "no, carbs are bad", to "no, wine is actually good". The expectation that the scientific community would find consensus on a novel virus in a matter of months is unreasonable. Personally I've taken @nntaleb's reasoning on it: whether or not masks work with certainty is irrelevant; we have good enough data to suggest they work, so we take the insurance and wear them. There is nothing wrong with a healthy amount of skepticism, and claims are often muddied through the low-pass filter that is Twitter (ex. experts warned that singling out China was racist when American cases came from Italy; we needed far more policy than singling out the Chinese).
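The insurance argument attributed to @nntaleb above can be written down as a crude expected-value check; every number here is invented purely to show the asymmetry:

```python
# Crude asymmetry sketch with invented numbers: even under real uncertainty
# about whether masks help, a cheap intervention against a large potential
# loss has positive expected value.
p_masks_help = 0.5     # generous uncertainty about effectiveness
loss_averted = 100.0   # relative harm avoided if they do help
cost_of_wearing = 1.0  # relative inconvenience of wearing one

expected_value = p_masks_help * loss_averted - cost_of_wearing
print(expected_value > 0)  # -> True: take the insurance
```

The point is that the decision doesn't hinge on resolving the scientific uncertainty first; as long as the cost is tiny relative to the potential loss, even a coin-flip probability justifies the precaution.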
That said, there is a mountainous gap between "we are unsure if masks work" and "the president is fighting an underground shadow cabal of pedophiles and blood drinkers." COVID deniers aren't suggesting we skip vaccines because they are unsure of the vaccines' effectiveness - they are suggesting we skip them because the vaccines are actually microchips from Bill Gates designed to track us.
There is a clear difference between misinformation and just plain being wrong - there's a difference between telling me LeBron James is going to win a championship next year and telling me LeBron is actually an alien from Mars who came here to trick us into watching basketball so they could steal all our water. If the President were wrong about LeBron James winning, he would be wrong. If the President were trying to convince me of the latter, then I think that is misinformation.

Basically: everyone needs to calm down.
The grace to be wrong has been increasingly replaced by some sort of fight-or-flight response to being challenged. It's not wrong to say that there are life-and-death consequences w/r/t the important issues. That said, the increasingly emotional response to that is detrimental to discourse. The more hysterical people become while advocating their point of view, the more difficult it is to hear their arguments. When someone is extremely emotional it triggers a natural skepticism: not because we think they are wrong per se, but because we know that when we are emotionally distressed it is more difficult to be rational.
It's certainly not a good idea to identify anything that deviates from the public expert consensus as "misinformation", but that doesn't mean there isn't a problem of verifiably wrong claims spreading. For example, Chicago and San Francisco have not fully opened their economy. I'm skeptical that the resolution to this problem looks anything like "information control", but it seems terribly naive to expect the problem will resolve itself when it's so easy to be ensnared by it.
I would classify this (and much of the current discourse about mis/disinformation) as "not even wrong".
It's difficult for me to express my thoughts on the matter without writing a huge, unintelligible screed.
The author, like many others, considers misinformation a disease that can be cured by using authority to administer objective information.
I feel strongly that we no longer have any choice but to accept all information is based in trust. This is not meant to be a metaphysical statement. Perhaps at the metaphysical level, objective truth exists. Regardless, at the scale society needs to verify information, objectivity is inaccessible.
Consider how long it took Russell's Principia to prove that 1+1=2. For the claims we encounter in our everyday lives, an appeal to objectivity will only add another layer of obfuscation.
In the case of the current crisis in the US, the government claims that the crisis is an attack by the Russians. Perhaps this is true on a superficial level, but the attack is only possible because sources of power in our society abuse that society's trust so heavily. This isn't limited to the government, but extends to the entire structure of power.
The Q conspiracy is on its face absurd and easily contradicted by reliable observations. Why do adherents engage in it? At some point when you're surrounded by lies on every side and no way to understand your environment, your brain just melts.
Unfortunately there is no sign that society will even stop digging itself into this hole.
> Unfortunately there is no sign that society will even stop digging itself into this hole.
I think that for things to change they will need to reach a breaking point. I hope and pray that we will be able to get through this breaking point and to the other side to a better civilization before some x-level threat destroys us.
Dear fellow HNers, this is some of the most dangerous rhetoric I've seen all week.
It's philosophically true that all information is based in trust. Yet I should warn you that all trust is reliant on assumption. Thus we end up with a paradoxical, revolving-door-esque statement which has no end.
Please ignore this poor lad; this comment is unintelligible and unreasonable... I'd honestly respect anyone who can seek the truth out for themselves.
I don't feel that I'm particularly unreasonable. I'm certainly interested in understanding your perspective.
I hope I'm not misinterpreting your statements, but it appears you agree that information is rooted in trust. In fact, you say it's philosophically so, while I make a weaker claim about practicality.
A paradox would arise if I were to say "information is based in trust, trust no one". But I would instead say "information is based in trust, empirically evaluate your basis for trust".
I'm not sure I recognize what's dangerous about the claim, or what your proposed alternative is.
It took me too long to realize the following: when a person says "there are too many lies in the media and I can't believe anything anymore", they don't have anything concrete about said lies; rather, the sheer volume of apocryphal narratives thrown at them is causing erratic behavior.
It's a known-incorrect error message: humans report statements as "baseless", "false", or "lies" in certain corner cases, the way some computer GUIs show "disk full", "permission denied", or "irrecoverable hardware error" for running out of inodes or exceeding file-size limits. It should not be taken literally.
I can't immediately point at a solution, but overloading them further by giving more precise information isn't one.
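The filesystem comparison holds up: Linux really does return the same ENOSPC ("No space left on device") whether the data blocks or the inode table are exhausted. A small sketch of the general pattern the comment describes, with an illustrative (not exhaustive) cause-to-message mapping:

```python
# Many distinct underlying conditions collapse into one coarse message,
# so the message itself can't be read literally -- you have to investigate
# the actual cause. The mapping below is illustrative, not exhaustive.
COARSE_MESSAGE = {
    "data_blocks_exhausted": "disk full",
    "inode_table_exhausted": "disk full",  # df -h looks fine; df -i is at 100%
    "acl_denied": "permission denied",
    "immutable_flag_set": "permission denied",
}

def what_user_sees(actual_cause: str) -> str:
    """Return the coarse message a user actually sees for a given cause."""
    return COARSE_MESSAGE.get(actual_cause, "unknown error")

# Two different causes, one indistinguishable report -- just as "it's all
# lies" can really mean "overloaded beyond my capacity to verify anything".
print(what_user_sees("inode_table_exhausted") == what_user_sees("data_blocks_exhausted"))  # -> True
```

Debugging either case means looking past the surface message, which is exactly the comment's advice for interpreting "everything is a lie".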
The article fails to make a distinction between misinformation (untruthful, false information that is spread, regardless of intent to mislead) and disinformation (dishonest, deliberately misleading or biased information). All disinformation is misinformation *, but not all misinformation is disinformation. And I don't see how a comparison with drugs is helpful.
There are two things that can be done:
One is requiring misinformation to be corrected, with penalties for disinformation. This won't be popular with free speech advocates, but maybe they should reconsider whether malicious lies (which have adverse consequences) ought to be protected. There's also the more practical problem that the disinformation might not be detected, or detected too late.
The other is to ensure that people are educated from an early age to spot dubious information and fallacious reasoning, and learn how to fact-check sources. An understanding of science (along with its acceptance of being proven wrong), history, and folklore is also helpful.

* There are rare exceptions to this.
The article title is a bit of a misnomer. Television has been with us for about a century now, and although it was not digital until more recently, TV likewise fits the definition of “a medicine or other substance which has a physiological effect when ingested or otherwise introduced into the body.”
Misinformation has always been a part of television, whether knowingly or not. Most people will fall down when shot: not because a bullet actually knocks you over, but because you’ve learned that that’s what happens from watching TV shows and movies.
Even without that, the parallels are clear, especially if we take “drug” to mean a recreational or illicit drug. Terence McKenna (who had lots of wild ideas that must be individually evaluated; you can’t trust him) said decades ago that if there was a pill you could take that encouraged you to sit on your couch for 4+ hours a day and “blot out the real world,” they’d have made it illegal ages ago.
I am not dismissing the dramatic rise of misinformation over the past decade or so. To me, it looks a lot more like the spiking of heroin with fentanyl than the heroin itself. The addiction was already present; it’s just gotten a lot more dangerous to consume.