Based. Nullius in verba [1]. The whitewashing of censorship is accelerating the erosion of trust in institutions, and it's refreshing to see a long-standing institution take a bold stance.
The problem is that we sometimes have to rely on other people. We can’t all be experts in everything. We can’t all reproduce or witness every experiment.
What we can do is delegate the process.
There’s a difference between saying “whatever X says is true, because X said it”, and “I trust what Y says because Y has explained the process and methods of how it was derived, and I don’t have reasonable doubt to assume Y lied”.
What to do when you have two very different opinions coming from experts:
- Anthony "I am The Science" Fauci
or
- Dr Malone (mRNA inventor)?
Who to believe?
Difficult to say, but these days, sadly, the first (and prudent) thing to do is to follow the money.
While credentials matter, the first thing I check when I read a paper/publication is who funded the work.
> The problem is that we sometimes have to rely on other people. We can’t all be experts in everything. We can’t all reproduce or witness every experiment.
Implying that anyone has to have expert knowledge.
> I trust what Y says because Y has explained the process and methods of how it was derived, ...
Doesn't that directly contravene what you said before?
> ... and I don’t have reasonable doubt to assume Y lied”.
Completely different issue, same problem though. Reformulated as a syllogism, you are basically saying "I trust that I don't have reasonable doubt, because we can’t all be experts in everything. We can’t all reproduce or witness every experiment.", which is really a definition of what you might consider reasonable. It becomes a recursive definition at second order: "What expert X says is true, because they do not have reasonable doubt. They sometimes have to rely on other people. We can’t all be experts in everything. We can’t all reproduce or witness every experiment." The base case for the recursion is something like "I don't trust X, because Y said they're an ass", where being an arse can come in many colours.
Of course you have to rely on experts; that is not the criticism. The criticism is that people with opposing views were silenced and declared spreaders of misinformation. It is not hard to craft a counter-argument if you have evidence for it; even just stating that you believe X is wrong is enough. Having X removed hints that you do not have contradicting evidence. If said evidence is not politically communicable, you still should not ban people while maintaining your position.
I see the root cause of all this debate as a failure in scientific communication.
When it comes to complex scientific decisions with significant impacts on the public, nuanced and detailed justifications are required. Instead, we often get simple declarations written by PR departments at a juvenile reading level. This spawns chaotic and poorly conducted debate in all walks of life.
What is required is a robust and transparent framework for honest analyses of topics.
For example, if the CDC has a specific recommendation, they should provide an outline of the arguments, counter-arguments, assumptions, and supporting data for each part.
It seems that the status quo is to completely ignore any arguments against a given recommendation. This suffocates any honest discussion in the crib. It also suggests to some that the justifications are not robust enough to survive the light of day. If you can't show your work, people will be skeptical that you did it at all.
This is applicable to any science-based public policy, but especially obvious for COVID policy.
I am pretty skeptical that complex science can be explained to normal people. I just tried to explain to my school that since my child tested PCR positive, then negative, for COVID, he shouldn't take a PCR test for 90 days (PCR is highly prone to reporting positive even when he is non-infectious), but instead (as the doctor's note says) should take an antigen test, and if he tests negative on that (and has no symptoms) he can return to school.
I explained this fairly simply but the message didn't get through (to a reasonably intelligent person) and they thought I was saying the complete opposite of what I was saying.
Instead, they are insisting on continued PCR tests (for "more data"), even though I have a doctor's note saying to use antigen tests.
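The rule being described is mechanical enough to write down, which is maybe the clearest way to see it isn't complicated. A minimal sketch in Python, encoding the 90-day window and the antigen-plus-no-symptoms condition exactly as the comment and doctor's note above describe them (this is the commenter's rule as stated, not official guidance):

```python
from datetime import date, timedelta

# The rule as described above, not official guidance: within 90 days of a
# positive PCR, PCR can keep reporting positive even when non-infectious.
PCR_WINDOW = timedelta(days=90)

def recommended_test(last_positive_pcr: date, today: date) -> str:
    """Inside the 90-day window PCR is unreliable, so use an antigen test."""
    return "antigen" if today - last_positive_pcr < PCR_WINDOW else "pcr"

def may_return_to_school(antigen_negative: bool, symptom_free: bool) -> bool:
    """Return-to-school condition as the doctor's note describes it."""
    return antigen_negative and symptom_free

# Example: positive PCR on Jan 10, checking on Feb 1 (well inside the window).
print(recommended_test(date(2022, 1, 10), date(2022, 2, 1)))           # antigen
print(may_return_to_school(antigen_negative=True, symptom_free=True))  # True
```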
This is a symptom of no centralized framework. You don't need people to memorize or understand the entire message, just a comprehensive reference source. We ship manuals with cars; we don't expect people to memorize them at the dealership.
In your case, wouldn't it be helpful to have a recognized standard reference you could just point them to and let them read, instead of having to explain it and point to some random web-page FAQ that they may not trust?
> I explained this fairly simply but the message didn't get through (to a reasonably intelligent person) and they thought I was saying the complete opposite of what I was saying.
If I detected you were being condescending I would do exactly this. Mostly to piss you off for thinking you’re better / smarter than me. You may know more than this staff member on this subject but there’s an appropriate way to explain things. Assuming from the get go they’re too stupid / uneducated is offensive.
You can't discuss a nuanced topic without detail, but it is solvable with parallel mediums. Not every debate fits in a tweet, so you issue long-form communication in parallel. This is common practice for many topics. You wouldn't expect a nuclear power plant or a rocket company to conduct all their internal business in self-contained tweets. You expand the medium used to provide the necessary bandwidth. You might have a PowerPoint with a single summary slide, several dozen supporting slides, and then a PDF going deeper.
I would dispute the idea that nuanced /messaging/ can't happen at scale (discussions are a different thing altogether). While I understand that everyone is up in arms about COVID-19 communications these days, there are lots of instances around the world of effective communication that in the end had to be fairly nuanced. Most populations understand that vaccines are useful and effective but not perfect, and most do understand that their particular risk is low but the population-level risk is quite high.
I guess we can quibble about the definition of "nuance" but the idea that vaccine science and hospital capacity management is very direct doesn't hold much water with me, maybe others disagree, IDK?
Whether you communicate, or do without communicating, there needs to be accountability.
When there's no accountability, it boils down to power games, which is what we've always had. As soon as you start playing the accountability game, you have to justify why you have power and I don't, and that makes people in power extremely nervous, because most of them are really not that bright, and any requirement to use rationality exposes that all too clearly.
Totally agree. There's an inherent tension between protecting the freedom of speech and the potential harm that speech might cause. The current incentive structure motivates people to say the most click-baity outlandish things without worrying about any of the consequences. Fact-checking can never catch up with all the crazy sh!t people come up with. That's why censorship feels like a tempting "easy way out" for combating misinformation.
Maybe one mitigation is to make public figures / media accountable for the avoidable damage their speech ends up causing? E.g., if I listen to your anti-vax radio program, consequently decide not to get vaccinated, and then catch the disease and die, my family can sue you for damages as long as it can be established that you purposefully misled / failed to assess the potential harm.
After a few class-action lawsuits like this, public figures & media will probably be more careful when they want to spread misinformation.
You would have had higher vaccine compliance if the dialogue had never been about mandates. Just give a health recommendation. You wouldn't have reached everyone, but the science also said that you don't need to.
In some countries we will see mandates without any perspective. For the third dosage? The fourth? The opposition is correct when it says you only get your freedom back if you boot out those responsible for this in the first place. No politician comes out and admits mistakes, not even in democracies.
People talk as if a political and a scientific opinion are equivalent. That is obviously not the case most of the time.
I think that many people have a low opinion of the general public, but I think they are more intelligent than people give them credit for.
If you want someone to take an action, tell them specifically what they have to gain; don't just provide some general platitude that it is good for them.
If you want someone to get boosted, say they have an X% reduced chance of death and a Y% reduced chance of hospitalization. Make an honest calculation and keep it up to date.
If the data doesn't exist, we have much bigger problems.
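As a sketch of what such an honest calculation could look like: the rates below are placeholders, not real figures, and both relative and absolute risk reduction are shown, since quoting only the relative number can mislead people about their personal risk.

```python
def relative_risk_reduction(rate_unvax: float, rate_vax: float) -> float:
    """RRR = 1 - (risk if vaccinated / risk if unvaccinated)."""
    return 1.0 - rate_vax / rate_unvax

def absolute_risk_reduction(rate_unvax: float, rate_vax: float) -> float:
    """ARR: the raw drop in risk, often more meaningful for an individual."""
    return rate_unvax - rate_vax

# Placeholder hospitalization rates over some fixed period -- NOT real data.
hosp_unvax = 40.0 / 100_000
hosp_vax = 6.0 / 100_000

print(f"Relative risk reduction: {relative_risk_reduction(hosp_unvax, hosp_vax):.0%}")
print(f"Absolute risk reduction: {absolute_risk_reduction(hosp_unvax, hosp_vax):.4%}")
```

Keeping numbers like these current as the data changes is exactly the "keep it up to date" part.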
Really? I'm speaking from a US perspective, but my opinion is the communication out of the CDC has too much nuance and is too concerned about being scientifically accurate. They forget they're speaking to a largely scientifically illiterate crowd, and that crowd sees the constant back-and-forth not as scientific progress but as a reason not to trust the institution. Never mind that it's often delivered with a tone of condescension people have come to despise from their institutions, and some of their policy recommendations are... debatable to say the least (recommending mass testing for high school sports/band/choir, for a recent example).
Honestly I only trust the CDC because I have the knowledge and capacity to verify what they say and ignore the bullshit. A lot of people don't have that. And there is bullshit to filter.
The message to the public, assuming the CDC is interested in actually convincing people, needs a simple path forward individuals can take to get said individuals something they want, even if it taps into a selfish motivation and isn't 1000% technically optimal. (e.g., get vaccinated and the mask requirements go away). And it should be repeated in every statement, regardless of new data. It should have been consistent for the past 2 years. It should have a catchy slogan. It should be on fucking billboards. People would still bitch and grumble, and some would never be convinced. But we'd probably have more people vaccinated if that approach had been taken. Instead we confused people and handed ammo to the anti-vaxxers by the truckload, relying on people's better natures and assuming a higher average level of education than actually exists in the American populace.
In short the CDC should optimize for leadership and "good enough" solutions, not strict scientific accuracy and trying to save each and every life in their public statements. Publish the hard science/raw data in more obscure releases that only scientists will care about. And they need a better front person than Fauci. Fauci just radiates highly-credentialed-beltway-insider arrogance however nice he tries to sound and however right he may be.
Of course you could argue such a role should be the place of the executive, and the CDC was trying to fill the gap left by a largely absent Trump administration, and by the time the Biden admin took over Fauci's position as front-man/leadership was already too solidified. Maybe so, but the CDC and public institutions in general have failed the task of public leadership, although thankfully have had success in logistics/vaccine development.
I believe this would have been an even worse strategy for policy acceptance. We were and still are dealing with unknowns, and these unknowns are able to justify wearing masks as a precautionary measure without a (not so) noble lie being necessary. Society isn't a military unit and has profoundly different dynamics. Some will ignore this advice, but the repercussions of that are very likely manageable.
This type of PR will not work with modern information infrastructure anymore, or will at least be significantly less effective.
> handed ammo to the anti-vaxxers by the truckload
It is a stupid enemy image, and policy was crafted with a steady eye on these groups. That is a mistake in leadership if there ever was one. You provided them with legitimacy without anything to gain, elevating your opposition at no benefit to yourself. That is neither competent leadership nor politics. Figuratively not looking them in the eye would be better: they are beneath you and don't even warrant attention. Few politicians would have even used the term, at least not the smart ones.
That said, since I oppose mandatory vaccination I am an anti-vaxxer myself in the eyes of many.
The issue that plagues us is a lack of trust in long-standing societal authorities.
We can't follow the entire scientific proof chain for every piece of information we encounter, because we don't have time.
So we rely on authority to some extent, whether that be peer review, government, independent bodies, etc.
We need to be able to trust that these bodies are telling us the truth and aren't seeking to mislead us. Because when we start to doubt them, we then inevitably elect alternative bodies, simply due to limited thinking capacity/time - as explained above it's impossible to do otherwise, no-one derives from first principles every opinion they hold.
The best way, _overall_, to convince someone to do something, is by clearly explaining to them the positives and negatives and letting them come to their own conclusion.
It doesn't always work, and there are specific situations (e.g. someone is holding a gun to your face) in which the cost/benefit analysis is very different - in such a situation, the short term is all that exists, the long term effect of misplaced trust is irrelevant - you simply have to neutralize them.
But in general, I'm absolutely sure that education over coercion is the correct approach for society.
Because if you force them, sure, you've got a short-term win, at the long-term cost of trust. And what is the long-term cost of lost trust?
I'm skeptical that education is a solution to collapsing societal trust.
First, the primary organs that would be capable of this education are the media, who have a profit motive to stoke any and all fears (immigration, radiological warfare, world war, coffee being carcinogenic).
Second, I think this distrust goes beyond "the CDC said something wrong once, I don't trust them as much anymore." That could be solved with education. But we're seeing a more fundamental rise in broad-based distrust of any authority figure, and it's a much more instinctive response.
In a world where authority is more powerful and centralized than ever, some are bound to feel powerless. Someone's actions on the other side of the world have more impact on your life than they ever could (pick depending on your priors: a Wuhan wet market customer's appetite, or a Wuhan Institute of Virology employee's carelessness). With such an interconnected world, how could conspiratorial thinking not be appealing to some people?
Power inherently fosters distrust and cynicism. You can have as much pedagogy and education as you want: rational discourse just can't combat a fundamental instinctive reaction, a growing sense of claustrophobia and powerlessness that some people are bound to feel. And the response to this distrust? More control over discourse, over communication mechanisms[0], and ultimately, over citizens. Governments and corporations are more powerful than they ever were. Surveillance, and enforcement of rules, are easier and more automated than ever. How could our primitive, Dunbar-number brain wiring not react with distrust, or even hostility towards (some) authority figures?
[0]: https://www.reuters.com/world/europe/politician-says-germany...
It's not just that the CDC and other institutions of scientific authority 'said something wrong once.' I think most people understand and accept that science is not perfect.
Instead, those institutions have shown that they are willing to publish noble lies. Specifically, they are willing to bend the truth and publish false information if they believe it is in the best interest of society / the greater good. Given an entity that has shown a willingness to bend the truth 'for the good of society', isn't distrust and cynicism rational? For example:
https://www.nbclosangeles.com/news/coronavirus/cdc-sets-shor...
I'm also skeptical. An unprecedented percentage of Americans have well over a decade of full-time education, yet here we are. People question its quality or assert that some degree-holding buffoon proves it worthless, but I doubt they have convincing empirical evidence that our population knew more in another era.
I agree with you that this won't happen, simply because (via an evolutionary basis) the maintenance of social order comes before the dissemination of truth. The two might coincide, but where they don't, social order comes first.
This is ultimately what happened during the early days of the coronavirus pandemic. It's what happens during fuel scares, it's what happened during 9/11 as a poster below explains, it's what happens within companies when layoff rumours circulate.
Even within the closest of romantic relationships each partner is not 100% truthful with the other because there are some things that are simply better not said.
Unless the idea is that we descend into some sort of authoritarian nightmare and the general population becomes irrelevant though, I'd argue that social order does require some semblance of trust in power even if only in the basics (e.g. if I don't cross the State, the State won't cross me).
Once that falls apart the whole thing is in ruins.
It wasn't social media that eroded the trust in research and science; it was the introduction of politics.
We need to separate research from politics. It needs to be a separate pillar of the state. You don't want researchers chasing funds continuously and doing short-term research for local sponsors. You want sizeable investments in long-term research projects, high autonomy, and adherence to the scientific method. This way people would trust the results.
Tell an algorithm to optimize for engagement and it will promote the content causing the most engagement: novelty, conspiracy theory, outrage, surprising "revelations" and all sorts of clickbait.
If we removed Twitter, Facebook and Tik Tok and reverted YouTube to the pre-algorithmic (subscription based) era, we'd still have free speech on the internet but probably a lot less bullshit.
Note: Twitter would still promote bullshit even without a ML algorithm. The rules of engagement:
- short content
- individual popularity means content popularity
- retweeting is instant, requires no critical thought
create an environment where clickbaity emotion-viruses spread at an exponential rate, and things that require engaging the brain are ignored. It's essentially a crowdsourced rating system for tabloid headlines by design.
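To make that mechanism concrete, here is a deliberately crude toy in Python: not any platform's actual ranking code, just an illustration of how optimizing a single engagement proxy buries anything that takes effort to read.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    outrage: float         # 0..1 proxy for emotional charge
    nuance: float          # 0..1 proxy for "requires engaging the brain"
    author_followers: int

def engagement_score(p: Post) -> float:
    """Toy objective: reward emotional charge and author popularity; nuance
    reduces clicks, so it effectively gets penalized."""
    popularity = min(p.author_followers / 1_000_000, 1.0)
    return 0.6 * p.outrage - 0.3 * p.nuance + 0.1 * popularity

feed = [
    Post("SHOCKING revelation!!", outrage=0.9, nuance=0.1, author_followers=2_000_000),
    Post("A careful thread on confounders", outrage=0.1, nuance=0.9, author_followers=5_000),
]
# Rank by the engagement proxy: the outrage post wins every time.
for p in sorted(feed, key=engagement_score, reverse=True):
    print(f"{engagement_score(p):+.2f}  {p.text}")
```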
I strongly protest the concept that the recent introduction of politics is making things worse than they used to be.
If you ask anyone who was senior working in these institutions in the 1970s, things were far more dodgy back then.
Corruption and bigotry were rampant. But it was far easier to cover up back in the day.
Politicization is mostly driven by transparency.
Politicization is not a new thing; it's a fact that there are a lot of extremely conservative individuals in each country, and technology has brought them closer to their institutions.
Society is being brought very closely together, and we're working through the teething issues.
The amount of resources available to spend on research is finite. Distribution of limited resources is politics.
Historically, scientists either were well-off and independent (e.g. Ulugh Beg, Newton, Cavendish) or worked at the expense of very rich personal patrons (e.g. Tycho). This limited the number of potential scientists, and the amount of science done, quite drastically.
In the 18th-20th centuries this was somehow overcome, as states and industries gave more scientists more resources. By the mid-20th century, the system started to develop cracks, as science became increasingly driven by formal metrics, such as impactful publications. The metrics are actively gamed, but worse, since science is unpredictable, honestly checking everything and building a picture of reality became a worse strategy than going for a flashy if more weakly researched result.
> It wasn't social media that eroded the trust in research and science; it was the introduction of politics.
It's not like the medical system was a highly trusted and respected institution that politicians somehow undermined in the last few months. The loss of trust in the medical system has literally been over 150 years in the making. If you read some books on the history of the rise of homeopathy in the United States in the mid-1800s, the parallels with today aren't exactly subtle.
I would argue that it wasn't just social media. Before the internet, it was relatively easy for a politician to 'pivot', to say something to one group, but quite a different thing to another. Now it is carefully catalogued and these days carefully tracked by anyone with any interest to care.
Social media just put some gas on a fire with engagement optimization. The rest was just human nature.
Thanks for your comment. I broadly agree with your points and find them insightful.
In particular I appreciate that you model authority as an analogue for trust, and trust as an analogue for independent verification.
I think trust is a prerequisite for successful convincing, because it is hard for human brains to overcome the emotional and relational elements of discourse, and we need to be in a receptive frame of mind. [citation needed]
We should study how authorities have built and lost trust, but I don't think we should be rebuilding those authorities using this knowledge. Instead I think focusing on how members of the public and different experts and scholars can build more diverse networks of trust relationships would create a more resilient system in the Internet age.
For example, instead of "I trust the CDC because they are an authority on disease" or "I trust Fauci on COVID because he is an immunologist", what if the common relationship was "I trust Fauci on COVID because his work on Zika and HIV was very interesting".
A related problem is what I call the 'knowledge gap'. Researchers develop a very comprehensive understanding of their chosen subjects, whereas members of the public have only a lay-person level of understanding. As experts become increasingly experienced and knowledgeable, the gap between lay-person understanding and expert comprehension grows. This means researchers are not always able to understand what a member of the public needs to know or how to explain it: how do you compress years of learning into a few minutes of talking, as part of a conversation? Good science communication skills are essential for experts if they want to develop trust relationships with members of the public, and if we as a society want to move from authority and coercion and towards education.
I've posted elsewhere in the thread about my own opinion that I believe the main issue we have is not with trust in the scientific sense (e.g. I think that most people generally _do_ believe that epidemiologists know what they're doing and that the outliers are exceptions), but with trust in the plan of action e.g. the entire chain of process.
Examples, all primarily via trust (I'm not a biologist):
I believe that excessive consumption of sugar can cause diabetes.
I know that alcohol can (will, given enough time) cause liver disease.
I know that driving my car drastically increases my risk of being in a car accident and therefore meeting a swift end, an agonising end, or perhaps a lifetime of disability.
I still engage in those activities despite the downsides because I believe them to be of net benefit, both to myself and to society as a whole. I'm willing to fully argue through that process.
Where I think the issue lies at present is that we're sorely lacking in convincing full-chain argumentation.
A person can trust that the UK health authorities know their stuff, they can believe that coronavirus exists and can be dangerous, but that doesn't solve the problem of whether or not they should skip out on their friend's birthday party or not. It only makes up a very small part of the jigsaw puzzle, the rest still has to be filled in somehow.
I actually think we may learn a lot by studying whom someone distrusts and the underlying causes. I imagine we would find that most of the time, the distrust is projected distrust: in other words, I distrust person X in my life because they did thing A, and person Y is like X, therefore I distrust person Y as well.
For example, I think many distrust Fauci less because of Fauci himself and his behaviors, and more because his behaviors remind them of a primary care physician who treated them condescendingly, saying they were stupid for asking a question. Or because a parent got Alzheimer's or cancer, nobody knows where it came from, and doctors told them their family member would be fine, even though the family is not fine.
I guess I just wish we paid more attention to the transgressions that happened, to find the presumed cause of distrust so we can repair it, because I think trust can be highly efficient and distrust highly inefficient at times.
I don't disagree, and it's probably the right option - but most people are lazy thinkers. You want the simplest answer that explains the problem in a way you understand.
If someone is trying to take advantage of a situation that's inherently complicated or unpleasant - it's easy to do so with a simple message that makes it digestible, even if it's wrong or harmful. We've seen this time and time again over the last few years. And then worse still, social media perpetuates it by keeping you in a bubble and pushing you further down a path.
You could be right, and as far as I can tell - that's what we've done.
But is it sustainable to lie to people in order to effect a desired (even if positive) change? How long can it go on for before people just turn off?
Maybe the answer is indefinitely. Empirically, at least in my country, this doesn't seem to be the case.
I've heard this described as "pandemic fatigue", but it seems to revolve more around taking orders than around believing in or being concerned about the pandemic itself.
People are willing to follow counterintuitive rules for a while, but eventually they need to understand some sort of final benefit derived.
The vast majority of people don't get "not drinking bleach fatigue" because they know it's bloody awful and that if they don't drink it, they're gone.
I see that as a failing of the policy setters and communicators. You can communicate a simple message, but you have to show your work too. Pretending counter-points don't exist only leads to "gotchas" and distrust down the road. This has been the theme of the last few years for me. If you want more than a public policy soundbite, you should be able to get a public position paper describing the assumptions, arguments, and counter-arguments.
This part is a huge society-wide problem that can theoretically be fixed so people can become more informed and active participants in the framework of their daily lives. Don’t ask me how, though, because I just got off work and am too exhausted to do anything except let the news tell me what to be afraid of today, then probably binge The Office and slam a few brewskis until I fall asleep on my couch ‘cause I got work again tomorrow
Even if I do have time and energy, my lack of knowledge on the subject matter would probably lead me to waste that time on dead ends, wrong conclusions, or learning things I will never use again outside of fact-checking one study.
We, as a society, bought a dog, and it took a literal shit in our communal kitchen. No one wants to clean it up, and frankly, I can't blame you/me/any one individual for it. It's exhausting, but it's a society-wide problem. It sucks that the only response from platforms has been to ban the discussion, rather than pay for a writers' corps to come up with easy-to-parse FAQs on why some of the garbage misinformation is a pile of lies, and to write systematic takedowns of odious dreck. But here we are.
The CDC recently dropped to 50% public trust, after repeated failures, mixed messages and outright lies in some cases.
They lie and then act incensed when the public stops believing them. Given the massive damage they've done, I'd consider it justified.
The CDC is an incompetent, bloated and fundamentally dishonest organization that was perfectly comfortable throwing grandma to the wolves via lies (masks don't work!) if it meant their unpreparedness wasn't exposed.
Via anti-masking lies, CDC irreparably damaged their reputation and also likely got people killed who believed them. The damage is deserved.
I remember pre-covid thinking that the CDC was the only federal agency that had its stuff together, but early on in COVID they really came off as inept. They failed to stockpile PPE (masks and such) or even make provisions for procuring the same. Then they told us masks don't work (as you noted) before eventually flipping on that position. Then they bungled the tests (instead of going with the WHO test like the rest of the world, CDC rolled its own which didn't work and was much less efficient to process).
Still, I don't share the distrust of a lot of folks regarding their policies and decisions over the last year or so. I think they're probably making reasonable calls for the available evidence even if hindsight eventually proves them wrong. To the extent their critics are correct, I think most will have arrived at their conclusions as a matter of luck (e.g., someone who knows nothing of virology happens to hear some study or other that hints that the CDC may be wrong and bases their entire worldview on that study without any awareness of the counter evidence).
For long-term scientific correctness, what the Hacker News crowd could help with tremendously is reproducibility, and that means versioning and continuous integration with data for research software. In the context of a paper or of doing research, you have to write software anyway.
The other thing that pops to mind is services that help with statistical calculations.
Maybe there are lots of other things as well - I haven't been involved with scientific research in a long time.
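As one concrete, hypothetical shape the reproducibility idea could take: pin the exact dataset a paper used by hash, and have CI refuse to re-run the analysis against anything else. The file names, the pinned hash, and analysis.py below are all placeholders.

```python
import hashlib
import subprocess
import sys

# Hypothetical pinned input for one analysis step (placeholder hash).
PINNED = {"data/measurements.csv": "9f8e7d..."}

def sha256(path: str) -> str:
    """Hash a file in chunks so large datasets don't need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def check_inputs() -> bool:
    """CI gate: refuse to run if the data isn't the version the paper claims."""
    ok = True
    for path, expected in PINNED.items():
        actual = sha256(path)
        if actual != expected:
            print(f"MISMATCH {path}: {actual[:12]}... != {expected}")
            ok = False
    return ok

if __name__ == "__main__":
    if not check_inputs():
        sys.exit(1)
    # Re-run the analysis exactly as published; any change in outputs then
    # shows up as a diff under version control.
    subprocess.run([sys.executable, "analysis.py"], check=True)
```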
This is not helped by the lack of scientific and logical literacy in the public at large, who will cherry-pick to support either position on a topic and call that "homework done".
The scientific process involves testing the accepted and pushing ideas to their extremes. The fact that there is a test of 5G on mice isn't surprising, but when such a test shows a statistically insignificant blip and it's not in the conclusion, that's not "big science hiding the real truths"; it's that this blip in isolation is a true statistical anomaly...
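A quick simulation shows why an isolated blip is expected rather than suspicious: compare two groups drawn from the same distribution many times, and some runs will show a "difference" purely by chance (the threshold here is arbitrary, purely for illustration).

```python
import random

random.seed(42)

def one_null_experiment(n: int = 50) -> float:
    """Two groups from the SAME distribution: any 'effect' is pure noise."""
    a = [random.gauss(0, 1) for _ in range(n)]
    b = [random.gauss(0, 1) for _ in range(n)]
    return sum(b) / n - sum(a) / n

# Run 1000 null experiments and count the ones with a notable-looking blip.
diffs = [one_null_experiment() for _ in range(1000)]
blips = sum(abs(d) > 0.4 for d in diffs)  # ~2 standard errors for n=50
print(f"{blips} of 1000 no-effect experiments still showed a blip")
```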
Absolutely. At least here in the United States there’s a full on collapse of trust in institutions and supposed expertise and for extremely good reasons.
Over and over again those with authority have squandered it. An opioid epidemic devastated so many communities after doctors, health care executives, and McKinsey and Company said the new class of drugs were non-addicting. Our manufacturing sector evaporated after economists and politicians said free trade would benefit everyone. Thousands of young people were sent to die in the Middle East in a war all the experts said was necessary and would be relatively painless.
Those are just the first three examples off the top of my head. I could keep going. We were promised technology would improve our lives. We were promised borrowing money for education would improve our lives. And on and on and on.
All bullshit. All promulgated by supposed experts with impeccable credentials. It’s a full on disaster and the chickens have come home to roost.
E.g., a doctor at a jail tested COVID treatments on inmates to see if they helped. That's a ground-shaking loss of trust! Even where that's one crazy doctor and an isolated case, it's hard for the disenfranchised not to generalize to the rest of the medical establishment.
Edit to add: I found out what happens when you present a video testimonial strongly calling into question vaccine clinical trials - it gets downvoted.
So what happens when you present a video testimonial like this - https://www.youtube.com/watch?v=L2GKPYzL_JQ - and people just dismiss it? They don't understand that with clinical trials you're meant to keep track of and report all outcomes, and that by limiting the scope via an app with limited options and no free-form input, you're crafting a very narrow scope of outcome, literally doctoring the result by minimizing the potential range and severity of adverse events. (That is, if people even watch the video or understand the implications of what Maddie's mother has shared; and if not, then you're not able to educate or inform them further - probably because it's a counter-mainstream-narrative point, with the mainstream arguably captured by government and by regulatory capture from industrial complexes, etc.)
Does the video described build or harm trust in our institutions? Given that this happened with this clinical trial, does that not invite the extrapolation that all the other clinical trials were likewise ignored, not given proper oversight, and lacking in integrity?
The interesting thing about trust is it takes much longer to build than to spend.
Given the rate at which scientific institutions have been spending trust over the past few years, I'm not sure how this problem can be alleviated within our current framework.
Let people draw their own conclusions and don’t censor or otherwise penalize them for questioning one interpretation of data vs another. Encourage people to do their own research and investigate who is funding the sources used for that research. There is no need for conspiracy theories when financial incentives exist.
But people should be helped to draw their own conclusions with annotations where every time some crazy idea is repeated it is automatically flagged with "This idea has repeatedly been debunked by 37 national societies of medicine and 19 meta analyses of 2,000 peer-reviewed studies with a quality deemed high or superb," etc.
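A toy sketch of that annotation-instead-of-removal idea; the claim registry and the regex matching are stand-ins, since a real system would need far more robust claim matching than this.

```python
import re

# Hypothetical registry mapping debunked-claim patterns to evidence summaries.
DEBUNKED = {
    r"5g .*(cause|spread).*(covid|virus)": (
        "This idea has repeatedly been debunked; see the relevant "
        "meta-analyses and national medical societies' reviews."
    ),
}

def annotate(post: str) -> str:
    """Flag posts matching a known debunked claim instead of removing them."""
    for pattern, note in DEBUNKED.items():
        if re.search(pattern, post, flags=re.IGNORECASE):
            return f"{post}\n[flag] {note}"
    return post  # everything else passes through untouched

print(annotate("I heard 5G towers spread the virus!"))
```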
Should I be penalized if my interpretation of the Israel data is that Jews are dirty and should be exterminated? That would be a pretty outlandish interpretation! I would HOPE I would be censored for that!
Do you mean "Do their own research" like make up conspiracy theories out of thin air and parrot nonsense deep state rhetoric? Completely divorcing yourself from reality? You want more of that?
I think unfortunately we also need to protect science from idiots making very bad conclusions based on not understanding a paper... That's how we end up with 'bacon cures cancer' headlines in the daily fail...
Trust wasn't lost first. Talking heads swooped in for money or power, and they sell whatever they have to in order to achieve those goals. There isn't any sort of fact they are arguing for in good faith.
You can add any number of layers of scientific rigor and it won't convince someone who has a favorite talking head.
Is it not still that way? There may be a bias towards believing trusted institutions, but that's because they have reputations for delivering truth based on experiments.
> no-one derives from first principles every opinion they hold
But it's the only correct way: if credible studies contradict an official agency (more often it's the degree of the opinion where there is disagreement, not the opinion itself), my advice would be to trust the studies. For instance, MRI contrast agents: the EMA banned most of them while the FDA did not, which is not the advice one would form by reading the studies on the matter.
I believe that if you take for yourself the right to neutralize or silence an opponent, they get to do the same in kind. State authority is mostly based on show and relies on people complying, even in autocracies. So if you do silence people, you should be very sure about it and about the costs. There was never such a point in the recent pandemic, or would you disagree?
It is also difficult to come up with an example where it was warranted to suppress information. Can you name one? Educated and intelligent people are often very distrustful, especially of authority.
Even if someone doesn't walk us through the entire chain of proof, I'd settle for clearly delineated junctions and paper trails; basically full transparency, with the ability to augment and shim any gaps people identify along the way. Nothing is implemented by default, but it would be great to have this public body of work where someone can spot-check on the fly to see if things line up. It's a lot to ask for with no incentive to push such a thing through, but maybe I'm not seeing a potential profit motive behind such a project.
That's ... not true? Children can learn a basic model of the greenhouse effect in school, and do so in most civilized nations. Whether this is sufficient is another thing entirely, but I would personally contend that (non-propaganda) education can never be 100% effective at causing someone to believe something?
> The issue that plagues us is a lack of trust in long-standing societal authorities.
This misses a lot of the cause. People aren't listening on the vaccine because there were no WMDs in Iraq. All the "experts" said that too.
The fact that huge frauds have been perpetuated at scale against our entire society and nobody has been held to account is a major reason for collapsing trust. It would be very helpful if the parties responsible had at least been censured even if the political class couldn't muster the will to truly apply penalties, but few in the political class have even admitted that anything bad happened.
When trusted authorities perpetuate frauds or display extreme incompetence and then just paper it over (often at others' expense), they are destroying the fundamental trust basis of society. Opportunistic politicians are destroyers of civilization.
What if the scientific proof chain were published as a knowledge graph? It could be followed easily, and all questions from skeptics could theoretically be answered automatically.
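A minimal sketch of the idea, with invented claims and placeholder evidence: store each claim's support as edges, then walk them so every skeptic's "why?" bottoms out in primary evidence.

```python
# claim -> supporting claims/evidence; anything absent from the dict is
# treated as leaf evidence. All entries here are invented placeholders.
GRAPH = {
    "masks reduce transmission": [
        "aerosols carry virus",
        "mask RCT meta-analysis (placeholder reference)",
    ],
    "aerosols carry virus": ["air-sampling studies (placeholder reference)"],
}

def proof_chain(claim: str, depth: int = 0) -> None:
    """Print the support tree for a claim, down to its leaf evidence."""
    leaf = claim not in GRAPH
    print("  " * depth + ("[evidence] " if leaf else "[claim] ") + claim)
    for child in GRAPH.get(claim, []):
        proof_chain(child, depth + 1)

proof_chain("masks reduce transmission")
```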
Well, the trust has to be earned by a history of proven, visibly good outcomes. Never forget that there is a long history of cases where the established, scientifically accepted ideas were actually harmfully wrong, particularly in the medical arena. In the USA in particular, medical practice is far from a benign charity, so people are right to question. One example: I had a tonsillectomy when I was a kid. There is no actual evidence that this is good medicine.
On the other hand, I don't disagree that the level of misinformation is ridiculous. It's difficult to understand why someone would trust someone with a popular podcast so much more than licensed professionals. Maybe there shouldn't be censorship, but where are the medical malpractice claims against these people? I think it's all tied in with the general acceptance of the "these statements have not been evaluated by the FDA" supplement industry. As long as your supplement does not do too much direct harm, we allow it.
> It's difficult to understand why someone would trust someone with a popular podcast so much more than licensed professionals.
I'm less convinced that this is a matter of trust, I think it's more about value systems and priorities.
I trust my dentist to recommend me the best filling or crown material for my teeth, even in the face of financial incentives.
But his advice does not tell me, in an absolute sense, whether it'd be worth me paying $1K for a crown if that means I miss the rent next month. It's not a tractable problem.
I can't speak to the US situation, but in the UK, in my peer groups, most coronavirus-related argumentation doesn't revolve around whether e.g. wearing a mask reduces the spread of coronavirus, but around whether wearing a mask is a net positive for society and/or the wearer. The former is a matter for epidemiology; the latter is nowhere near clear cut.
> there is a long history of cases where the established, scientifically accepted ideas were actually harmfully wrong, particularly in the medical arena.
This isn't actually a problem so long as those past errors were caused by the limits of what we knew at the time, and efforts are made to help prevent similar issues in the future where possible. It's inevitable that as our understanding of science and medicine evolves, we're going to discover that what made sense before is no longer a good idea.
The problem comes when we weren't wrong because of what we didn't understand, but because people who knew better just thought they could get more money if they manipulated results or outright lied. We had the tobacco industry pay off scientists to lie about the cancer risks the industry knew to be a problem. The resulting rise in people with lung cancer wasn't a mistake. We had doctors pushing opioids on people at insane doses because they were paid kickbacks if they did. That wasn't a mistake either.
What we need is strict regulation and oversight so that when science and medicine do get it wrong, it's because we couldn't have known better given what data we had at the time. That'd be a huge step up from where we are now.
- Accepted fact 1: eating saturated fat causes more coronary heart disease than unsaturated fat (very high correlation, and accepted as causal)
- Accepted fact 2: eating trans fat is bad. A diet high in trans fats can contribute to obesity, high blood pressure, and higher risk for heart disease, because intake of dietary trans fat disrupts the body's ability to metabolize essential fatty acids. Trans fat is also implicated in Type 2 diabetes.
Historically margarine was low in saturated fat but high in trans fat due to partial hydrogenation - hence the questionable "health benefits" of it vs butter.
Currently in most of the developed world vegetable spreads are not allowed to contain significant amounts of partially hydrogenated oils, and thus margarine should be healthier - but there is a powerful dairy lobby so the bad reputation will last for a long while...
From a social psychology perspective, censorship is known to increase interest in whatever you ban or try to withhold. Censorship also interferes with educating people and helping them improve their understanding.
People often learn their understanding is flawed by saying something "stupid" and getting a response to that. Censorship fosters a climate of fear where people are less likely to say the "dumb" thing and get it explained.
A good policy with raising children is "There are no bad questions. You can ask (parent) anything and will not get in trouble, even if the answer is Wow, that's a really bad word and means (something bad). Please don't use that at school or I will get called by the teacher."
Tenure track changed papers from vehicles for expressing ideas, subject to a strong test, into hurdles to jump for the security of your life's work.
Arguably, thesis by body of work is less corrosive, but producing 3-4 papers from the PhD is now basically a signal to the university that you can be that performing seal.
That, and IPR. "a new drug which xxx (in mice)" paper is worth significantly more to the company behind it during share price discussions. (Obviously this is shorthand because IPR has risks in premature publication as well)
> For decades, corporate undermining of scientific consensus has eroded the scientific process worldwide. Guardrails for protecting science-informed processes, from peer review to regulatory decision making, have suffered sustained attacks, damaging public trust in the scientific enterprise and its aim to serve the public good. Government efforts to address corporate attacks have been inadequate. Researchers have cataloged corporate malfeasance that harms people’s health across diverse industries. Well-known cases, like the tobacco industry’s efforts to downplay the dangers of smoking, are representative of transnational industries, rather than unique. This contribution schematizes industry tactics to distort, delay, or distract the public from instituting measures that improve health—tactics that comprise the “disinformation playbook.” Using a United States policy lens, we outline steps the scientific community should take to shield science from corporate interference, through individual actions (by scientists, peer reviewers, and editors) and collective initiatives (by research institutions, grant organizations, professional associations, and regulatory agencies).
Feels like too many people think “published paper = truth”, when really the paper is just the start of a journey towards truth, on which hurdles such as replication, independent scrutiny, and common-sense/real-world testing must be overcome before we can start treating them as reliable. And I think most scientists do actually get this, and what we suffer from is a media environment that jumps on the first sniff of a result and holds it up as definitive proof of something.
[1]: http://en.wikipedia.org/wiki/Nullius_in_verba
If you think of it as (1), you see failures in scientific communication.
If you think of it as (2), you see an authority expecting to be obeyed.
How can you have scientific communication without nuance? We know nuanced discussions can't happen at scale. Is this even solvable?
https://news.ycombinator.com/item?id=30128061
The leaked Collins and Fauci emails show there was a deliberate decision to trade truth for control of the narrative. [0]
This isn't incompetence or messaging being too simple. This is a failure of philosophy: that of "noble lies".
Much of our leaders' behavior can be explained through this lens.
Masking, natural immunity, lab leak, etc.
At every turn, the facts were skewed and skeptics were penalized in order to railroad everyone into vaccination.
The failure is that the people in charge don't see this line of action as a failure, but rather a necessary evil. A means to an end.
[0] https://news.yahoo.com/reps-comer-jordan-expose-fauci-160210...
[0]: https://www.reuters.com/world/europe/politician-says-germany...
Instead, those institutions have shown that they are willing to publish noble lies. Specifically, they are willing to bend the truth and publish false information if they believe it is in the best interest of society / the greater good. Given an entity that has shown willingness to bend the truth 'for the good of society', isn't distrust and cynicism rational?
https://www.nbclosangeles.com/news/coronavirus/cdc-sets-shor...
This is ultimately what happened during the early days of the coronavirus pandemic. It's what happens during fuel scares, it's what happened during 9/11 as a poster below explains, it's what happens within companies when layoff rumours circulate.
Even within the closest of romantic relationships each partner is not 100% truthful with the other because there are some things that are simply better not said.
Unless the idea is that we descend into some sort of authoritarian nightmare and the general population becomes irrelevant, though, I'd argue that social order does require some semblance of trust in power, even if only in the basics (e.g. if I don't cross the State, the State won't cross me).
Once that falls apart the whole thing is in ruins.
We need to separate research from politics. It needs to be a separate pillar of the state. You don't want researchers chasing funds continuously and doing short-term research for local sponsors. You want sizeable investments in long-term research projects, high autonomy, and adherence to the scientific method. This way people would trust the results.
Tell an algorithm to optimize for engagement and it will promote the content causing the most engagement: novelty, conspiracy theory, outrage, surprising "revelations" and all sorts of clickbait.
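To make that mechanism concrete, here is a minimal sketch of an engagement-optimized ranker. All names and weights are made up for illustration; real platforms use learned models rather than hand-tuned formulas, but the objective has the same shape.

```python
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    clicks: int           # click-throughs
    comments: int         # replies, including angry ones
    shares: int           # re-posts amplify reach
    dwell_seconds: float  # time spent reading

def engagement_score(post: Post) -> float:
    # Comments and shares are weighted heavily because they drive reach;
    # since outrage and novelty generate the most of both, maximizing this
    # score mechanically promotes the most provocative content.
    return (1.0 * post.clicks
            + 3.0 * post.comments
            + 5.0 * post.shares
            + 0.1 * post.dwell_seconds)

def rank_feed(posts: list[Post]) -> list[Post]:
    # Surface whatever maximizes engagement; nothing in the objective
    # asks whether a post is true, useful, or healthy to read.
    return sorted(posts, key=engagement_score, reverse=True)
```

Nothing in that objective rewards accuracy, so conspiracy and outrage win by construction.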
If we removed Twitter, Facebook and Tik Tok and reverted YouTube to the pre-algorithmic (subscription based) era, we'd still have free speech on the internet but probably a lot less bullshit.
Note: Twitter would still promote bullshit even without an ML algorithm. Its rules of engagement create an environment where clickbaity emotion-viruses spread at an exponential rate, while things that require engaging the brain are ignored. It's essentially a crowdsourced rating system for tabloid headlines, by design.

If you ask anyone senior who was working in these institutions in the 1970s, things were far more dodgy back then.
Corruption and bigotry were rampant. But it was far easier to cover up back in the day.
Politicization is mostly driven by transparency.
Politicization is not a new thing; it's a fact that there are a lot of extremely conservative individuals in each country, and technology has brought them closer to their institutions.
Society is being brought very closely together, and we're working through the teething issues.
Historically, scientists either were well-off and independent (e.g. Ulugh Beg, Newton, Cavendish), or worked at the expense of very rich personal patrons (e.g. Tycho). This limited the number of potential scientists, and the amount of science done, quite drastically.
In the 18th-20th centuries this was gradually overcome, as states and industries gave more scientists more resources. By the mid-20th century the system started to develop cracks, as science became increasingly driven by formal metrics, such as impactful publications. The metrics are actively gamed, but worse, since science is unpredictable, honestly checking everything and building a picture of reality became a worse strategy than going for a flashy if more weakly researched result.
I don't know a good way out of the current trap.
It's not like the medical system was a highly trusted and respected institution that politicians somehow undermined in the last few months. The loss of trust in the medical system has been literally over 150 years in the making. If you read some books on the history of the rise of homeopathy in the United States in the mid-1800s, the parallels with today aren't exactly subtle.
That absolutely will never happen as long as the funding comes from the government.
Social media just put some gas on a fire with engagement optimization. The rest was just human nature.
In particular I appreciate that you model authority as an analogue for trust, and trust as an analogue for independent verification.
I think trust is a prerequisite for successful convincing, because it is hard for human brains to overcome the emotional and relational elements of discourse, and we need to be in a receptive frame of mind. [citation needed]
We should study how authorities have built and lost trust, but I don't think we should be rebuilding those authorities using this knowledge. Instead I think focusing on how members of the public and different experts and scholars can build more diverse networks of trust relationships would create a more resilient system in the Internet age.
For example, instead of "I trust the CDC because they are an authority on disease" or "I trust Fauci on COVID because he is an immunologist", what if the common relationship was "I trust Fauci on COVID because his work on Zika and HIV was very interesting".
A related problem is what I call the 'knowledge gap'. Researchers develop a very comprehensive understanding of their chosen subjects, whereas members of the public have only a lay-person level of understanding. As experts become increasingly experienced and knowledgeable, the gap between lay-person understanding and expert comprehension grows. This means researchers are not always able to understand what a member of the public needs to know or how to explain it: how do you compress years of learning into a few minutes of talking, as part of a conversation? Good science communication skills are essential for experts if they want to develop trust relationships with members of the public, and if we as a society want to move from authority and coercion and towards education.
I've posted elsewhere in the thread about my own opinion that I believe the main issue we have is not with trust in the scientific sense (e.g. I think that most people generally _do_ believe that epidemiologists know what they're doing and that the outliers are exceptions), but with trust in the plan of action, i.e. the entire chain of process.
Examples, all primarily via trust (I'm not a biologist):
I believe that excessive consumption of sugar can cause diabetes.
I know that alcohol can (will, given enough time) cause liver disease.
I know that driving my car drastically increases my risk of being in a car accident and therefore meeting a swift end, an agonising end, or perhaps a lifetime of disability.
I still engage in those activities despite the downsides because I believe them to be of net benefit, both to myself and to society as a whole. I'm willing to fully argue through that process.
Where I think the issue lies at present is that we're sorely lacking in convincing full-chain argumentation.
A person can trust that the UK health authorities know their stuff, and they can believe that coronavirus exists and can be dangerous, but that doesn't settle whether or not they should skip their friend's birthday party. It only makes up a very small part of the jigsaw puzzle; the rest still has to be filled in somehow.
For example, I think many distrust Fauci less because of Fauci himself and his behaviors and more because his behaviors may remind them of their primary care physician who treated them condescendingly, saying they were stupid for asking a question. Or because their parent got Alzheimer's or cancer and don't know where it came from and were told by doctors that their family member would be fine, even though their family is not fine.
I guess I just wish we paid more attention to the transgressions that happened, so we can find the presumed cause of distrust and repair it, because I think trust can be highly efficient and distrust highly inefficient at times.
If someone is trying to take advantage of a situation that's inherently complicated or unpleasant - it's easy to do so with a simple message that makes it digestible, even if it's wrong or harmful. We've seen this time and time again over the last few years. And then worse still, social media perpetuates it by keeping you in a bubble and pushing you further down a path.
But is it sustainable to lie to people in order to effect a desired (even if positive) change? How long can it go on for before people just turn off?
Maybe the answer is "indefinitely", but empirically, at least in my country, that doesn't seem to be the case.
I've heard this described as "pandemic fatigue", but it seems to revolve more around taking orders than around believing in or being concerned about the pandemic itself.
People are willing to follow counterintuitive rules for a while, but eventually they need to understand some sort of final benefit derived.
The vast majority of people don't get "not drinking bleach fatigue" because they know it's bloody awful and that if they don't drink it, they're gone.
This part is a huge society-wide problem that can theoretically be fixed so people can become more informed and active participants in the framework of their daily lives. Don’t ask me how, though, because I just got off work and am too exhausted to do anything except let the news tell me what to be afraid of today, then probably binge The Office and slam a few brewskis until I fall asleep on my couch ‘cause I got work again tomorrow.
They lie and then act incensed when the public stops believing them. Given the massive damage they've done, I'd consider it justified.
The CDC is an incompetent, bloated and fundamentally dishonest organization that was perfectly comfortable throwing grandma to the wolves via lies ("masks don't work!") if it meant its unpreparedness wasn't exposed.
Via anti-masking lies, the CDC irreparably damaged its reputation and likely got people killed who believed it. The damage is deserved.
Still, I don't share the distrust of a lot of folks regarding their policies and decisions over the last year or so. I think they're probably making reasonable calls for the available evidence even if hindsight eventually proves them wrong. To the extent their critics are correct, I think most will have arrived at their conclusions as a matter of luck (e.g., someone who knows nothing of virology happens to hear some study or other that hints that the CDC may be wrong and bases their entire worldview on that study without any awareness of the counter evidence).
The other one that pops to mind is services that help with statistical calculations.
Maybe there are lots of other things as well - I haven't been involved with scientific research in a long time.
A theoretical climate change example might be:
Car A emits 10% less CO2 than car B.
Driving is 30% of your annual emissions (i.e. it's significant and one of the low hanging fruit).
That 30% of your annual emissions if multiplied by 7 billion people is likely to cause X.
If everyone drives car A instead of car B, Y will happen instead.
Y is better than X, because we end up with a more verdant Earth, cheaper food, better weather, less war (insert reason here).
Therefore, our overall quality of life is higher if we switch to car A.
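To show how that chain bottoms out in actual numbers, here is a toy version of the calculation. Only the 10%, 30%, and 7 billion figures come from the example above; the 10-tonne per-capita footprint is an assumption added purely to make the arithmetic runnable.

```python
PER_CAPITA_TONNES = 10.0   # assumed annual CO2 footprint per person (illustrative)
DRIVING_SHARE = 0.30       # driving is 30% of annual emissions
CAR_A_REDUCTION = 0.10     # car A emits 10% less than car B
POPULATION = 7_000_000_000

driving_tonnes = PER_CAPITA_TONNES * DRIVING_SHARE    # 3.0 t per person
saved_per_person = driving_tonnes * CAR_A_REDUCTION   # 0.3 t per person
saved_globally = saved_per_person * POPULATION        # 2.1e9 t per year

print(f"Per-person saving: {saved_per_person:.1f} t CO2/year")
print(f"Global saving:     {saved_globally / 1e9:.1f} Gt CO2/year")
```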
Without the whole information chain, you just have isolated statistics that don't really mean anything.
Convincing argumentation is not only about mathematics, it's about appealing to the things that matter to people.
The scientific process involves testing the accepted and pushing ideas to their extremes. The fact that there is a test of 5G on mice isn't surprising, but when such a test shows a statistically insignificant blip and it doesn't appear in the conclusion, that's not "big science hiding the real truths" - it's that this blip, in isolation, is a genuine statistical anomaly...
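As a small illustration of why isolated blips are expected rather than suspicious, consider what happens when you run many comparisons on pure noise. This sketch is not modeled on any actual 5G/mouse study; it just simulates p-values under the null hypothesis, where they are uniformly distributed.

```python
import random

random.seed(42)
N_TESTS = 100   # e.g. 100 measured outcomes in one study
ALPHA = 0.05    # conventional significance threshold

# Under the null hypothesis (no real effect anywhere), a p-value is
# uniform on [0, 1), so about ALPHA of them look "significant" by chance.
p_values = [random.random() for _ in range(N_TESTS)]
false_positives = sum(p < ALPHA for p in p_values)

print(f"{false_positives} of {N_TESTS} null comparisons look 'significant'")
# Expected: N_TESTS * ALPHA = 5 spurious blips from noise alone.
```

A blip that vanishes under replication, or under correction for multiple comparisons, is exactly what you'd expect from noise, which is why it doesn't make a study's conclusion.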
What caused that trust to be lost? Is it something they did?
Over and over again those with authority have squandered it. An opioid epidemic devastated so many communities after doctors, health care executives, and McKinsey and Company said the new class of drugs was non-addictive. Our manufacturing sector evaporated after economists and politicians said free trade would benefit everyone. Thousands of young people were sent to die in the Middle East in a war all the experts said was necessary and would be relatively painless.
Those are just the first three examples off the top of my head. I could keep going. We were promised technology would improve our lives. We were promised borrowing money for education would improve our lives. And on and on and on.
All bullshit. All promulgated by supposed experts with impeccable credentials. It’s a full on disaster and the chickens have come home to roost.
https://www.cbsnews.com/news/arkansas-inmates-ivermectin-fed...
Hmm. I wonder how we got there...[1]
[1] https://i.imgflip.com/63b63m.jpg
So what happens when you present a video testimonial like this - https://www.youtube.com/watch?v=L2GKPYzL_JQ - and people just dismiss it? People don't understand that in clinical trials you're meant to keep track of and report all outcomes. Worse, by limiting the scope via an app that has limited options and no free-form input, you're crafting a very narrow range of reportable outcomes - literally doctoring the result by minimizing the potential range and severity of recorded adverse events. (That is, if people even watch the video or understand the implications of what Maddie's mother has shared; if not, you can't educate or inform them further - probably because it's a counter-mainstream-narrative point, with the mainstream media and government arguably captured via regulatory capture by industrial complexes.)
Does the video described build or harm trust in our institutions? And if this happened with this clinical trial, doesn't it invite the extrapolation that the other clinical trials were likewise ignored, given no proper oversight, and lacking in integrity?
Given the rate at which scientific institutions have been spending down trust over the past few years, I'm not sure how this problem can be alleviated within our current framework.
Why do you assume that the financial incentive is for the truth rather than for the conspiracy theory?
You can add any number of layers of scientific rigor and it won't convince someone who has a favorite talking head.
Yeah, because there's no rational basis to continue to afford them that trust. You're describing the symptom, not the disease.
But it's the only correct way: if credible studies contradict an official agency (more often the disagreement is about the degree of an opinion, not the opinion itself), my advice would be to trust the studies. For instance, MRI contrast agents: the EMA banned most of them while the FDA did not - which is not the conclusion one would form by reading the studies on the matter.
It is also difficult to come up with an example where it was warranted to suppress information. Can you name one? Educated and intelligent people are often very distrustful, especially of authority.
Society is not a boot camp of grunts.
(... And that's assuming it can be provided without being shut down as "indoctrination").
This misses a lot of the cause. People aren't listening on the vaccine because there were no WMDs in Iraq. All the "experts" said that too.
The fact that huge frauds have been perpetrated at scale against our entire society and nobody has been held to account is a major reason for collapsing trust. It would be very helpful if the parties responsible had at least been censured even if the political class couldn't muster the will to truly apply penalties, but few in the political class have even admitted that anything bad happened.
When trusted authorities perpetrate frauds or display extreme incompetence and then just paper it over (often at others' expense), they are destroying the fundamental trust basis of society. Opportunistic politicians are destroyers of civilization.
Being able to construct a graph does not mean it's possible to check whether the claims are correct, though.
On the other hand, I don't disagree that the level of misinformation is ridiculous. It's difficult to understand why someone would trust someone with a popular podcast so much more than licensed professionals. Maybe there shouldn't be censorship, but where are the medical malpractice claims against these people? I think it's all tied in with the general acceptance of the "these statements have not been evaluated by the FDA" supplement industry. As long as your supplement does not do too much direct harm, we allow it.
I'm less convinced that this is a matter of trust, I think it's more about value systems and priorities.
I trust my dentist to recommend me the best filling or crown material for my teeth, even in the face of financial incentives.
But his advice does not tell me, in an absolute sense, whether it'd be worth me paying $1K for a crown if that means I miss the rent next month. It's not a tractable problem.
I can't speak to the US situation, but in the UK, in my peer groups, most coronavirus-related argumentation doesn't revolve around whether e.g. wearing a mask reduces the spread of coronavirus, but around whether wearing a mask is a net positive for society and/or the wearer. The former is a matter for epidemiology; the latter is nowhere near clear cut.
This isn't actually a problem so long as those past errors were caused by the limits of what we knew at the time, and efforts are made to help prevent similar issues in the future where possible. It's inevitable that as our understanding of science and medicine evolves, we're going to discover that what made sense before is no longer a good idea.
The problem comes when we weren't wrong because of what we didn't understand, but because people who knew better just thought they could get more money if they manipulated results or outright lied. We had the tobacco industry pay off scientists to lie about the cancer risks the industry knew to be a problem. The resulting rise in people with lung cancer wasn't a mistake. We had doctors pushing opioids on people at insane doses because they were paid kickbacks if they did. That wasn't a mistake either.
What we need is strict regulation and oversight so that when science and medicine do get it wrong, it's because we couldn't have known better given what data we had at the time. That'd be a huge step up from where we are now.
Who knows what damage that has done to my internals.
- it plastic no food
- it’s vegetable good
- it’s cholesterol bad
- it’s trans fat bad, butter no trans good
- cholesterol ain’t real, margarine good
I don’t care :p
- Accepted fact 2: eating trans fat is bad. A diet high in trans fats can contribute to obesity, high blood pressure, and higher risk for heart disease, because intake of dietary trans fat disrupts the body's ability to metabolize essential fatty acids. Trans fat is also implicated in Type 2 diabetes.
Historically margarine was low in saturated fat but high in trans fat due to partial hydrogenation - hence the questionable "health benefits" of it vs butter.
Currently in most of the developed world vegetable spreads are not allowed to contain significant amounts of partially hydrogenated oils, and thus margarine should be healthier - but there is a powerful dairy lobby so the bad reputation will last for a long while...
People often learn their understanding is flawed by saying something "stupid" and getting a response to that. Censorship fosters a climate of fear where people are less likely to say the "dumb" thing and get it explained.
A good policy with raising children is "There are no bad questions. You can ask (parent) anything and will not get in trouble, even if the answer is Wow, that's a really bad word and means (something bad). Please don't use that at school or I will get called by the teacher."
Arguably a thesis by body of work is less corrosive, but producing 3-4 papers from the PhD is now basically a signal to the university that you can be that performing seal.
That, and IPR. A "new drug which xxx (in mice)" paper is worth significantly more to the company behind it during share-price discussions. (Obviously this is shorthand, because IPR carries risks in premature publication as well.)
> For decades, corporate undermining of scientific consensus has eroded the scientific process worldwide. Guardrails for protecting science-informed processes, from peer review to regulatory decision making, have suffered sustained attacks, damaging public trust in the scientific enterprise and its aim to serve the public good. Government efforts to address corporate attacks have been inadequate. Researchers have cataloged corporate malfeasance that harms people’s health across diverse industries. Well-known cases, like the tobacco industry’s efforts to downplay the dangers of smoking, are representative of transnational industries, rather than unique. This contribution schematizes industry tactics to distort, delay, or distract the public from instituting measures that improve health—tactics that comprise the “disinformation playbook.” Using a United States policy lens, we outline steps the scientific community should take to shield science from corporate interference, through individual actions (by scientists, peer reviewers, and editors) and collective initiatives (by research institutions, grant organizations, professional associations, and regulatory agencies).
UCS Case studies: https://www.ucsusa.org/resources/disinformation-playbook