I remember being very concerned that the fact checkers wouldn't be operating in a vacuum and we'd have a bunch of facebook execs being arbiters of truth.
And that appears to be exactly what has happened. This is terrifying. A handful of mainstream publications and a bunch of basically unknown facebook folks are now the censors. We've already seen well known EFF activists rated untrue for their arguments against warrantless surveillance, and now the censors already can't agree on climate change. I've seen arguments about Confederate generals rated as false because of modern news events - in ways that disagree with well tenured academic historians.
The problem is that there is a well-funded war on facts.
This isn't new either; the legacy version was the systematic denial that smoking caused cancer. This has been going on since the 1950s, and is still being litigated. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3490543/
The result of this is that any single point of fact-determination becomes a target. It's good of Facebook to take some kind of stand against the most fraudulent and racist articles that get circulated on there; the "blood libel" against Jews and so on. But now they're dragged into the other conflicts. There will be a lot of money available from Exxon-Mobil to deny climate change, for example.
Ultimately the concept of "fact" as something outside subjectivity will be destroyed. The rotation of the earth around the sun is really just your opinion, right? /s If heliocentricity were politically convenient or profitable there would be an astroturf campaign promoting it on facebook.
Ideally, facts could assert themselves. Unfortunately, humans are needed to assert facts, and therefore any judgements based on facts require human interpretation.
We can dissolve interpretations down as far as we like to leave as little room for bias and dishonesty (and that is a useful effort), but ultimately there will always be POWER in the hands of those who we abdicate arbitration authority to.
...and power corrupts.
The alternative is creating a diverse ecosystem of organizations/individuals whose analysis/logic people can choose to subscribe to and judge based on the reputation they garner over time.
...and diversity of thought and analysis liberates.
I agree with your post in general, but consider that accuracy and being correct is hard and takes time and consideration. This is something almost no one values when speaking in normal conversations or the equivalent online venues (Twitter, Facebook, reddit, HN, etc.). Take your post, for example, with its incorrect use of "heliocentricity". It means an astronomical model in which the Earth and planets orbit the Sun at the center of the Solar System. It has been the accepted theory of planetary motion for centuries and can be independently verified in many different ways :). Not the best example of something incorrect that might be promoted with astroturfing.
Yup. This is why I actually agree with Zuckerberg on his “no censorship” approach.
He’s smart enough to recognize that taking on the burden of what should be censored and what shouldn’t is a quagmire he’d prefer to avoid. No matter how “good” your filters are, you’re going to piss off a massive group of customers, despite all the effort.
It’s just a massive time sink with little to no benefit.
I think it's fine for Facebook to opt out of fact-checking, as long as they also opt out of algorithmic syndication of content.
If my feed is going to have posts shared by others (even non-friends) or, worse, op-eds in it, then FB is vouching for whatever I see. If they share a link with 50 million people, they're certainly responsible for the consequences.
Worse is the ad side, where Facebook wants to continue to profit from well-funded wars against science while still claiming to be neutral.
If you get paid to broadcast a message and then broadcast it, you are not neutral.
> I actually agree with Zuckerberg on his “no censorship” approach.
But this is complete nonsense! Facebook cheerfully and aggressively censors anything sexual, far over and above local concerns of legality. Even before FOSTA/SESTA.
But he already plays by having an algorithm that suggests articles/news/groups to you. It would be one thing if it were just a dump of what your friends post and nothing else. But the moment you "suggest" content to people, you take an editorial stance. If you take an editorial stance and start pushing an article to millions of people, then you have a responsibility.
You can't have it both ways. This isn't a phone company.
> It’s just a massive time sink with little to no benefit.
There is arguably a large social benefit to preventing widespread misinformation being disseminated. Perhaps no great business benefit - indeed it is likely a massive expense. But someone somewhere is paying that cost, however it manifests.
I'm sure he'd prefer to avoid it, but Facebook has become influential enough in people's day-to-day information consumption that at this point, doing nothing is itself a stance in favor of the well-funded and well-connected.
Facebook can’t have it both ways. They can’t tell Congress they are like the telephone company (an agnostic communication channel) and then turn around and censor whatever they feel like. Either take responsibility for the content or stay out of it.
The whole point of section 230 of the CDA was to say that they can.
Before 230, there had been court rulings that said if a platform moderated user supplied content they were liable for that content. Section 230 was written to reverse those rulings.
I'm not sure why, but a myth has taken hold that under 230 you have to be neutral to get the liability shield.
This is a common misunderstanding that keeps being repeated despite being wrong. Section 230 says the exact opposite: you can have it both ways, moderating some but not all content without getting in trouble.
But this is exactly what the public seems to be demanding of Facebook - i.e. take responsibility for censoring your content, but only the content I disagree with. FB is in a no-win situation unless they can offload the content moderation to a government entity.
... And before that, there was local news and newspapers. Some time before that it might be your boss, your parents, or the local clergy.
This is no change, but business as normal. What has changed is that we actually have easy access to information outside of the keyholders. So when the government and some in society dismiss complaints about police brutality, we can have the rebuttal of actual video (an example with both historical and current examples), and it is so much harder to dismiss. Unfortunately, not everyone is willing to believe facts - but again, I'm not sure this is more true today either, since folks have been rejecting things like modern medicine and science for generations if not centuries. We simply know about it more, thanks - in part, for good and bad - to facebook. And reddit and HN and every social place on the internet.
There's quite a difference between social media interjecting its opinion into your private, personal conversations and whatever the clergy at your local church back in the 90s would have said. They would not have even had visibility into say, what you might be talking about at the local pub or hanging out with your neighbors. A local newspaper might pitch a story, but editorials were clearly labeled and there were often competing newspapers in most towns. Statistically, police brutality isn't a problem and far more people suffer from the effects of too little policing than not (see murder rate in Chicago or Juarez). In this era, we suffer from magnification and amplification of the marginal. A thing that does not occur very often, statistically, can get blown completely out of proportion. The power of social media is the power of propaganda. Propaganda has been used throughout history to start wars, revolutions, religious battles, etc. It has the ability to shape the perception of reality in a way that has heretofore been unthinkable. Facebook alone has an audience and reach that no TV channel or newspaper has ever had. Quantity has a quality all its own.
I do think there are at least two material differences today.
One is that you no longer need "physical access" to someone to manipulate them. It is so much easier to get your message to someone today. You don't even need to buy a costly newspaper or TV network. You can be individually targeted from anywhere in the world relatively easily. The scale is completely different.
Second is that attackers are able to target specific people in a specific way thanks to the data collected by us online (I mean, your clergy doesn't have access to your search history). Your local clergy has to put forth a message that both appeals to an entire congregation and also fits in roughly with what is expected to be heard from clergy. The congregation can then talk to each other and decide if the message makes sense. Attackers going after a specific person don't have to deal with either of these.
Even in cases of clear video of an event, two people may come to opposing conclusions about what the video actually depicts. An example I saw on HN recently was the video from Seattle of a car hitting and killing two protestors on a highway. One commenter believed the video depicted reckless but ultimately unintentional manslaughter, while another believed it depicted premeditated murder. I think they were both sincere in their opinions, but they both watched the same video and can't both be right.
Another example is the videos of the collapse of the WTC7 building. Many people view those videos and perceive an intact building being intentionally demolished. I view those same videos and perceive a building gutted by debris and fire collapsing under its own weight. We watch the same videos but come to opposite conclusions.
This reminds me of when people said that the DMCA would be used to silence critics, or DRM would be used to prevent people from fixing their own devices. In those other cases, too, the outcome was obvious and predicted, but they went ahead anyway.
We can only hope, pray, and take action[1] to try to prevent the same thing from happening with the government trying to force companies to build backdoors into encryption, a battle we have been fighting for literally decades[2].
And in fact, there is a huge difference between this era and that era: people who don't like Facebook's tone can use a different site (or run their own) for much lower cost and with fewer governmental barriers to entry than the FCC represents to the television industry.
> I've seen arguments about Confederate generals rated as false because of modern news events - in ways that disagree with well tenured academic historians.
Let us be honest here. The people whose job it is to sort out this stuff might also not be the brightest stars in the night sky. If I had vested interests where manipulation of social media nets me advantages, I would be very content with the situation.
"I remember being very concerned that the fact checkers wouldn't be operating in a vacuum and we'd have a bunch of facebook execs being arbiters of truth."
This isn't the hurdle - the reality of nuance is the hurdle.
Any one of us would have tremendous difficulty in parsing through the reams of various ways people can comment on the world around them.
It's an impossible problem. Not even a team of people watching your every post would produce consistent results.
"The earth increases in temperature by 0.1 degrees last year" - says John. Well ... there are a handful of different ways of indicating the 'Earths Temperature' so we're already dead in the water.
"The winters are colder where I am so looks like 'global cooling to me'" - says Judy. Well, if it is getting cooler where this person is, it would look like the Earth is cooling to a person with that narrow frame of reference.
If it was a war of 'scientific voices' with references to specific articles etc. this would be one thing, but we're talking about FB: casual discussion and propagation of 'water cooler banter'.
* Patriot act is up for renewal.
* Amendment proposed to not allow section 215 to be used to search internet history W/O warrant.
* Patriot act reauthorized W/O internet history protection amendment.
* PolitiFact rules "[Congress] voted for federal agencies to have access to your internet history without obtaining a warrant." is false because of semantic word games.
I'm curious about the confederate general thing, do you have a link? It's true there is gray area everywhere, I guess I have a hard time believing that Facebook would bother stepping into a genuine aspect of history, in an area that isn't really much disputed.
It seems to me that the problem lies not in the fact that our information is unreliable, but that the social contract has broken down. If you want to foster an environment where honesty is viable, you have to start from the very foundations of society.
Facebook said they don't want to be the arbiters of truth, so they're allowing climate change deniers to prevent their posts from being flagged as false, which they are.
Ironically, this news is terrifying only if you have a very inaccurate view of human behavior. The following posts about "mindless masses" are so dismissive that I have to question the veracity of the analysis. People who use facebook are constantly faced with the dissonance of things found on facebook that do not correlate with reality. Even if FB were your one source of truth (there are no people who "only use facebook", or only FB+Google, etc.), people cannot inherently trust everything they read, given the day-to-day contradictions with their own community realities.
Not OP but I can imagine an example might look something like this: (Please note below is not my view, but a hypothetical)
Professor of history/analytical view of history may label Robert E. Lee a skilled tactician.
Someone on facebook posts a Robert E Lee post saying he was a skilled tactician.
'Skilled tactician', seeming to be a positive stance on Robert E. Lee and running counter to the viewpoint that, as a Confederate general, he was pure evil with no positive qualities, gets the post flagged as false despite the historians' assertions that he was a skilled tactician.
How is Facebook the arbiter of truth when many, many websites other than Facebook exist, and most human beings do not use Facebook?
This is a very different scenario than a single company owning every broadcast tower in a town, pre-internet.
The ability to access Facebook strongly implies general internet access, which is a compelling argument against Facebook exerting any sort of authority over general truth.
69% of people from the US use Facebook. In Europe, it's a little over 80%. Saying that "most humans" don't use it seems pointless at best, disingenuous at worst. If this platform becomes an arbiter of truth, it's clearly a massive problem across many huge portions of the globe.
Facebook has literally billions of monthly active users [0], close to two billion daily active users [1] and very likely a number of total accounts that's 10+ billion [2]. Instagram has another billion mau, and half a billion dau [3].
I struggle to think of any other private entity with that much global reach into so many peoples lives, the only other example that comes to mind is Google, the two of them combined have direct influence over 70%+ of internet traffic [4].
Staying clear of their impact is active work because even if you don't use their services, chances are many and often even most people around you still do.
Facebook is the 6th most popular website on the planet (behind Google, Youtube, and 3 Chinese sites). That puts them in a position of control over a great deal of the world's information diet. If they say something is false or true, many people will believe it. If they censor something, many people will never see it.
What a disgustingly terrible article. Never explains what precise facts Facebook is taking into consideration as opinion, just throws around political leanings and pointlessly polarizes it as "climate scientists" vs "climate change deniers".
This is the kind of journalism that only serves to divide the people even further by empowering their tribalism instead of trying to discuss the matter at hand.
More insidiously, it's another drumbeat for the idea that we need classical media gatekeepers to invade people's personal space and "correct" any wrong-think. A very dangerous and unwelcome idea for a free society. The media has no valid purpose for having the power to amend private citizens' conversations; we wouldn't tolerate it at a bar and we shouldn't on our personal web space.
The greatest trick the devil ever pulled is making you think he doesn't exist.
Even without active censorship, Facebook very much is a gatekeeper invading your personal space. It's just that rather than building up walls around you, it's merely giving you a shovel and showering you with dopamine bursts every time you dig yourself down into your own personal hole. The end effect is the same, though: all you hear is echoes bouncing off the walls around you. You just think the echoes are only your own.
I agree a free society shouldn't tolerate amending private citizens' conversations. However, that comes with the responsibility of recognizing the more insidious ways of doing that, such as when your attention has been hijacked with industrialized behavioral profiling. The "private citizen conversations" you hear bouncing off your walls are being algorithmically seeded by whatever brings in the most dollars the most efficiently. That's not a truly free society, that's just a very subtle form of mass indoctrination.
The devil already exists out there whether you think he does or not. And when you play with the devil, the only winning move is to not play his game.
>More insidiously, it's another drumbeat for the idea that we need classical media gate keepers to invade people's personal space and "correct" any wrong-think. Very dangerous and unwelcome idea for a free society.
We desperately need some mechanism for correcting bad thinking based on falsehoods. You can believe whatever you want if your beliefs have no effect on me, but the world has become so complex and interconnected that it's very difficult to point to something and say "your opinion on this has no consequences for other people."
Vaccines? Herd immunity is affected. Not vaccinating your child has little impact on yourself, but if enough people do it, it affects the entire civilization. What should be done about this?
This article is about one of the biggest, climate change.
Every political issue is painted with the brush of ideology, and that has come at the cost of actual facts and thinking. There is the old saying that "strict adherence to rules is nothing more than an excuse to avoid thinking." The same can be said of ideology.
It is natural that the mediums we have for disseminating information should have responsibility for the integrity of that information.
But I am looking forward to hearing what you would suggest such a mechanism would be.
The analysis is, ironically, guilty of the same cherry-picking it accuses the original article of. The checkers identify two false claims and two misleading ones out of Shellenberger's bulleted list of twelve "facts few people know". But the summary never engages with the article's core claim, that climate change won't be the end of the world or of human society even though many in the general public think it will. The first checker explicitly agrees with that claim!
There's also a larger question here. If claims which dispute scientific consensus are fact-checkably false, what does that mean for scientific debate? If Facebook had been created in the 60s, would Freudianism have been forever locked in as an effective form of psychiatry?
The "reasonable analysis" is hilarious. E.g.: climate change is affecting wildfires, BUT fires have decreased. Says a Geography professor from Swansea.
This kind of stuff only serves to create more doubt about climate science and, unfortunately among some people, science in general.
Would be nice to know what the actual claims are. I'm not at all a climate change denialist, but some of the things claimed by people trying to raise awareness for climate change are outright false, or at least extremely unlikely, e.g. humans almost certainly aren't going to go extinct because of climate change. But it can still be catastrophic at a society level.
It's a shame the issue is so polarized, because while the claims that the earth is warming and that the warming is at least in part caused by CO2 and other forms of pollution are solid, some of the assumptions going into the models are questionable and can be legitimately debated.
The problem is there are three groups: climate deniers, climate alarmists, and climate realists. The first two are much louder than the third, so the debate quickly turns into an extremist yelling match full of misinformation in the public arena. Most people never settle into the minority middle ground of climate realist, because it can be difficult to muddle through the extremist views to get to the truth. And even if you do, the majority of people will just make it a point to tell you how wrong you are anyway.
Here's a more helpful one: people who actually have expertise and peer-reviewed publications, people who at least pay attention to the above, and everyone else.
This problem was created by social media by choosing which articles to show in your news feed based on engagement. That is, how much you'll emotionally react to it.
The solution is not to add another layer of "fact checking". That's super dubious.
Get rid of what got us here.
News feeds organized by engagement are evil. Dismantling them would solve both the production and consumption of emotion driven information.
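A minimal sketch of the mechanism being criticized, with entirely hypothetical post names and scores: the same set of posts ordered chronologically versus ordered by predicted engagement. The ranked feed surfaces whatever provokes the strongest reaction, regardless of when it was posted or how informative it is.

```python
# Hypothetical posts: id, hours since posting, and a (made-up) predicted
# reaction count of the kind an engagement model might produce.
posts = [
    {"id": "calm_report",  "age_hours": 1,  "predicted_reactions": 3},
    {"id": "outrage_bait", "age_hours": 20, "predicted_reactions": 90},
    {"id": "friend_photo", "age_hours": 5,  "predicted_reactions": 12},
]

# A plain "dump of what your friends post": newest first by age.
chronological = sorted(posts, key=lambda p: p["age_hours"])

# An engagement-ranked feed: strongest predicted reaction first.
by_engagement = sorted(posts, key=lambda p: p["predicted_reactions"], reverse=True)

print([p["id"] for p in chronological])  # ['calm_report', 'friend_photo', 'outrage_bait']
print([p["id"] for p in by_engagement])  # ['outrage_bait', 'friend_photo', 'calm_report']
```

The day-old outrage post jumps to the top the moment ordering switches from time to engagement; that re-ordering is the editorial act the comments above describe.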
Traditional news media is also organized by engagement: an entire department is responsible for picking the home page of a newspaper and importance of each story.
The reason it became so pronounced is that social media got very good at it and then taught the lesson to traditional media. They, too, now use metrics of engagement to choose what they show.
It is true, also, that this exploits a weakness of people. But there are other weaknesses that we regulate in the interest of a healthy society.
I'm not a fan of regulation but a general awareness of the core issue would go a long way towards a solution.
Right now, the talk about the issue seems misdirected. I can only see the proposed solutions putting immense power of controlling narrative in a few hands.
I don't understand why it's Facebook's role to teach people. Maybe if regulations were instead imposed on school curriculums we wouldn't have dumb adults arguing about climate change, flat earth or mask wearing. Combat issues at the root not the fruit.
Facebook endumbens people. It's there front and center of the stupidification process, whether it wants to be or not. And it wants to be: it's aware that the mechanisms it uses to attract eyeballs (and thus revenue) favor lies and propaganda over facts. Facts are boring. Proof that your ideological enemies are out to get you is endlessly fascinating.
They want to have it both ways, to remain sticky but without having the world become a worse place for it. I'm sure they'd love it if schools didn't also contribute to the increasing gullibility of the US, but that's not something they can control. Their own feeds, by contrast, are something they have power over.
It isn't! When Facebook announced the program [1], they made it clear that they had no interest in becoming arbiters of truth; fact checking was about preventing hoaxes, stories like "5G causes coronavirus" that are just completely and unambiguously made up. The expectation that Facebook fact checkers should intervene in live political controversies was inevitable IMO, but I don't think many people would explicitly endorse the idea that Facebook has to teach people good climate science.
According to the piece, the fact checkers claimed an opinion piece was false because all of the facts in it were cherry picked:
"The researchers found that the post by the CO2 Coalition was based on cherry-picked information to mislead readers into thinking climate science models are wrong about global warming."
This is not a logical argument. Facts are correct or incorrect, whether they're cherrypicked or not.
Cherry picking truths and listing them out of context (i.e. wildfire reduction without mentioning that land usage changed) may not be fake news, but it definitely smells like propaganda.
not defending Facebook actions, which are wrong for a different set of motives detached from the actual article quality, but this isn't the censorship smoking gun people want it to be
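As a toy illustration with made-up numbers: every value in a cherry-picked claim can be individually true while the selection still tells a false story. A noisy upward trend always contains windows that slope down.

```python
# Made-up numbers: a noisy but upward-trending series (e.g. some yearly
# measurement). Every value is a true "fact"; the dishonesty is in which
# values you choose to report.
series = [10, 14, 9, 16, 12, 19, 13, 21]

overall_change = series[-1] - series[0]  # 21 - 10 = 11: the trend is up
cherry_picked = series[6] - series[5]    # 13 - 19 = -6: "look, it's declining!"

print(overall_change, cherry_picked)
```

Both numbers are computed honestly from the same data; only the second one was chosen to mislead, which is why "the facts are correct" and "the article is misleading" can both hold at once.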
Question: Facebook as a platform is not liable for information it contains. However, if Facebook publishes fact-checking info that is wrong or misleading, are they liable?
Could I sue facebook if their "facts" were proven wrong and caused me damage?
You can sue anyone for anything. Do you stand a chance? No.
The platform/publisher dichotomy is mostly just a right-wing talking point with absolutely no basis in law. As your example alludes to, a website can well be both at the same time.
For text that is Facebook’s own editorial content, the liability standard generally applied to any speaker (historically tested in relation to news media and book publishers) would apply, which is one of “malicious intent”: you need to prove that they wanted to cause the harm you suffered.
This is a rather high standard, but it follows from the US’s regard for free speech. You are likely to encounter additional difficulties trying to prove and quantify the specific damages you incurred.
But Facebook’s fact checking is set up somewhat differently, with third parties doing these checks. It seems almost impossible to make a case for intent against FB for a correction authored by any of these organizations.
Facebook is an international company and should comply with local laws if it wants to do local advertising.
My country bans propagation of certain ideologies, due to our history. For example, Heineken cannot use its logo here. It would be a criminal offense, not a civil lawsuit.
Another possible problem is anti-free-Hong-Kong propaganda and the latest presidential decree.
I don't really care about politics. Just showing that this is incredibly thin ice.
> A handful of mainstream publications and a bunch of basically unknown facebook folks are now the censors.
We seem doomed in this regard.
> This isn't new either; the legacy version was the systematic denial that smoking caused cancer.
(A subgenre of this is libel lawsuits, which are much less of a problem in the US due to "anti-SLAPP" laws: https://www.medialaw.org/topics-page/anti-slapp?tmpl=compone... but may be a significant problem in the UK; e.g. https://www.theguardian.com/media/2009/dec/17/bbc-trafigura and https://en.wikipedia.org/wiki/McLibel_case , lasting over a decade)
> It’s just a massive time sink with little to no benefit.
“The only way to win is to not play at all.”
> Facebook cheerfully and aggressively censors anything sexual, far over and above local concerns of legality.
(example list from 2018 of censored FB content rules: https://www.cnbc.com/2018/04/24/facebook-content-that-gets-y... )
> It’s just a massive time sink with little to no benefit.
The Rohingya massacre in Myanmar begs to differ.
https://www.nytimes.com/2018/10/15/technology/myanmar-facebo...
> This is no change, but business as normal.
No single partisan local newspaper editor had the kind of reach that facebook has.
Another example is the videos of the collapse of the WTC7 building. Many people view those videos and perceive an intact building being intentionally demolished. I view those same videos and perceive a building gutted by debris and fire collapsing under its own weight. We watch the same videos but come to opposite conclusions.
We can only hope, pray, and take action[1] to try to prevent the same thing from happening with the government trying to force companies to build backdoors into encryption, a battle we have been fighting for literally decades[2].
[1] https://act.eff.org/action/stop-the-earn-it-bill-before-it-b... [2] https://en.wikipedia.org/wiki/Clipper_chip
It's not much different than when the TV news was controlled by 3 networks.
Very different.
Do you have a source for this claim?
Why not?
Dead Comment
This isn't the hurdle - the reality of nuance is the hurdle.
Any one of us would have tremendous difficulty in parsing through the reams of various ways people can comment on the world around them.
It's an impossible problem. Not even a team of people watching your every post would produce consistent results.
"The earth increased in temperature by 0.1 degrees last year," says John. Well ... there are a handful of different ways of indicating the "Earth's temperature," so we're already dead in the water.
"The winters are colder where I am, so it looks like 'global cooling' to me," says Judy. Well, if it is getting cooler where this person is, it would look like the Earth is cooling to a person with that narrow frame of reference.
If it was a war of 'scientific voices' with references to specific articles etc. this would be one thing, but we're talking about FB: casual discussion and propagation of 'water cooler banter'.
I don't think anyone has a solution.
Got a link about that?
* Patriot act is up for renewal
* Amendment proposed to not allow section 215 to be used to search internet history W/O warrant.
* Patriot act reauthorized W/O internet history protection amendment.
* PF rules "[Congress] voted for federal agencies to have access to your internet history without obtaining a warrant." is false because of semantic word games.
https://www.politifact.com/factchecks/2020/may/21/facebook-p...
In other words, they're being arbiters of truth.
I share your concern. And, hasn't this been the case for as long as there have been centralized publications?
Ironically, this news is terrifying only if you have a very inaccurate view of human behavior. The posts below about "mindless masses" are so dismissive that I have to question the veracity of the analysis. People who use facebook are constantly faced with the dissonance between things found on facebook and reality. Even if FB were your one source of truth (there are no people who "only use facebook," or only FB+Google, etc.), they cannot inherently trust everything they read, given day-to-day contradictions with their own community realities.
A professor of history, taking an analytical view of history, may label Robert E. Lee a skilled tactician.
Someone on facebook posts that Robert E. Lee was a skilled tactician.
"Skilled tactician," seeming to be a positive stance on Robert E. Lee, runs counter to the viewpoint that, as a Confederate general, he was pure evil with no positive qualities; the post then gets flagged as false despite the historians' assertions that he was a skilled tactician.
Let's open a ministry for that; guess the name.
In a vibrant marketplace of ideas, a "Ministry of Truth" is just "quality standards."
This is a very different scenario than a single company owning every broadcast tower in a town, pre-internet.
The ability to access Facebook strongly implies general internet access, which is a compelling argument against Facebook exerting any sort of authority over general truth.
Facebook has literally billions of monthly active users [0], close to two billion daily active users [1] and very likely a number of total accounts that's 10+ billion [2]. Instagram has another billion mau, and half a billion dau [3].
I struggle to think of any other private entity with that much global reach into so many people's lives. The only other example that comes to mind is Google; the two of them combined have direct influence over 70%+ of internet traffic [4].
Staying clear of their impact is active work because even if you don't use their services, chances are many and often even most people around you still do.
[0] https://www.statista.com/statistics/264810/number-of-monthly...
[1] https://www.statista.com/statistics/346167/facebook-global-d...
[2] https://youtu.be/nmXACRrzLMA
[3] https://www.omnicoreagency.com/instagram-statistics/
[4] https://staltz.com/the-web-began-dying-in-2014-heres-how.htm...
This is the kind of journalism that only serves to divide the people even further by empowering their tribalism instead of trying to discuss the matter at hand.
Even without active censorship, Facebook very much is a gatekeeper invading your personal space. It's just that rather than building up walls around you, it's merely giving you a shovel and showering you with dopamine bursts every time you dig yourself down into your own personal hole. The end effect is the same, though -- all you hear is echoes bouncing off the walls around you. You just think the echoes are only your own.
I agree a free society shouldn't tolerate amending private citizens' conversations. However, that comes with the responsibility of recognizing the more insidious ways of doing that, such as when your attention has been hijacked with industrialized behavioral profiling. The "private citizen conversations" you hear bouncing off your walls are being algorithmically seeded by whatever brings in the most dollars most efficiently. That's not a truly free society; that's just a very subtle form of mass indoctrination.
The devil already exists out there whether you think he does or not. And when you play with the devil, the only winning move is to not play his game.
Deleted Comment
We desperately need some mechanism for correcting bad thinking based on falsehoods. You can believe whatever you want if your beliefs have no effect on me, but the world has become so complex and interconnected that it's very difficult to point to something and say "your opinion on this has no consequences for other people."
Vaccines? Herd immunity is affected. Not vaccinating your child has little impact on yourself, but if enough people do it, it affects the entire civilization. What should be done about this?
This article is about one of the biggest, climate change.
Every political issue is painted with the brush of ideology, and that has come at the cost of actual facts and thinking. There is the old saying "strict adherence to rules is nothing more than an excuse to avoid thinking." The same can be said of ideology.
It is natural that the mediums we have for disseminating information should have responsibility for the integrity of that information.
But I am looking forward to hearing what you would suggest such a mechanism would be.
https://environmentalprogress.org/big-news/2020/6/29/on-beha...
And here's a reasonable analysis of what it gets wrong: https://sciencefeedback.co/evaluation/article-by-michael-she...
Flagging it as misleading seems fair.
I cannot find the Washington Examiner oped in question.
There's also a larger question here. If claims which dispute scientific consensus are fact-checkably false, what does that mean for scientific debate? If Facebook had been created in the 60s, would Freudianism have been forever locked in as an effective form of psychiatry?
This kind of stuff only serves to create more doubt about climate science and, unfortunately among some people, science in general.
Deleted Comment
It's a shame the issue is so polarized, because while the claims that the earth is warming and that the warming is at least in part caused by CO2 and other forms of pollution are solid, some of the assumptions going into the models are questionable and can be legitimately debated.
Here's a more helpful one: people who actually have expertise and peer-reviewed publications, people who at least pay attention to the above, and everyone else.
Dead Comment
Unless these labels are ones that people apply to themselves, you're being pretty unhelpful by labeling people.
https://www.dailywire.com/news/shellenberger-on-behalf-of-en...
The solution is not to add another layer of "fact checking". That's super dubious.
Get rid of what got us here.
News feeds organized by engagement are evil. Dismantling them would solve both the production and consumption of emotion driven information.
The major publications do a better job at not sensationalizing everything, but this has been a problem since 1890 at least: https://en.wikipedia.org/wiki/Yellow_journalism
That social media naturally tends to become a tabloid magazine may be more a reflection on readers than the editor (whether human or machine).
The reason it became so pronounced is that social media got very good at it and then taught the lesson to traditional media. They, too, now use metrics of engagement to choose what they show.
It is true also that it is a weakness of people. But there are other weaknesses that we regulate in the interest of a healthy society.
I'm not a fan of regulation but a general awareness of the core issue would go a long way towards a solution.
Right now, the talk about the issue seems misdirected. I can only see the proposed solutions putting immense power of controlling narrative in a few hands.
Thank you for protecting me Facebook gosh darnit!
They want to have it both ways, to remain sticky but without having the world become a worse place for it. I'm sure they'd love it if schools didn't also contribute to the increasing gullibility of the US, but that's not something they can control. Their own feeds, by contrast, is something they have power over.
Facebook isn't making anyone dumb. They were already dumb; you just didn't notice before.
[1] https://www.theverge.com/2016/12/15/13960062/facebook-fact-c...
"The researchers found that the post by the CO2 Coalition was based on cherry-picked information to mislead readers into thinking climate science models are wrong about global warming."
This is not a logical argument. Facts are correct or incorrect, whether they're cherry-picked or not.
not defending Facebook actions, which are wrong for a different set of motives detached from the actual article quality, but this isn't the censorship smoking gun people want it to be
Could I sue facebook if their "facts" were proven wrong and caused me damage?
Per the landmark ruling NYTimes v Sullivan — no, you'd probably lose that case.
The platform/publisher dichotomy is mostly just a right-wing talking point with absolutely no basis in law. As your example alludes to, a website can well be both at the same time.
For text that is Facebook’s own editorial content, the liability standard generally applied to any speaker (though historically tested in relation to news media and book publishers) would apply, which is one of “malicious intent”: you need to prove that they wanted to cause the harm you suffered.
This is a rather high standard, but it follows from the US’s regard for free speech. You are likely to encounter additional difficulties trying to prove and quantify the specific damages you incurred.
But Facebook’s fact checking is set up somewhat differently, with third parties doing these checks. It seems almost impossible to make a case for intent against FB for a correction authored by any of these organizations.
My country bans the propagation of certain ideologies, due to our history. For example, Heineken cannot use its logo here; it would be a criminal offense, not a civil lawsuit.
Another possible problem is anti-free-Hong-Kong propaganda and the latest presidential decree.
I don't really care about politics. I'm just showing that this is incredibly thin ice.