A common argument in philosophy is that bad ideas should be aired in the open - where everyone may ridicule them. When they are cast to the shadows they grow under everyone's noses, in the only places they are allowed to exist. In private, these bad ideas cannot be challenged by others, and people will be convinced to believe in them with nobody pointing out that the idea is genuinely terrible.
It gives bad ideas a breeding ground in which to fester, and you'd never even know about it. Publicly prohibiting speech just brushes it under the rug while people pretend it doesn't exist. That doesn't make the problem go away - it's the same as shutting your eyes, covering your ears, and pretending the monster in front of you no longer exists. If you get to define certain speech as unspeakable, you justify the censorship of any speech determined to be unspeakable when you lose power and someone else steps up to the plate. As history has shown - it's a matter of "when", not "if".
Actions like this serve to placate the public. The general public won't be angry at Microsoft for censoring platforms "known for" or "complicit in" hate speech (or worse). It's always a slippery slope though. Once you open up the can of worms that is censorship, you justify future censorship.
I wait to see what will be next on the chopping block.
Edit: A few too many responses to respond to everyone, but if anyone would like to speak with me on it more in depth feel free to email me - information on how to contact me can be found on my profile.
That argument works well in relatively small social groups where people generally can be relied upon to act in good faith, and where there are social disincentives to clinging to bad ideas after they have been refuted.
When you have a small group with social bonds, you don't have to worry about a bad actor constantly repeating their already-refuted arguments to people who haven't heard the refutation.
But when you have the whole internet as your audience, no matter how many times you are refuted, you can say it again and a whole group of people will be hearing it for the first time.
I have been on the Internet since the 80s. What I have personally found is that a lot of its social norms were lifted from academia, and they work very well as defaults in a new social media context. Like Hacker News when it was first launched.
But eventually, every "social medium" on the Internet has its "Eternal September," when the number of new folks in any conversation outweighs the old hands. At that point, the social norms that worked for the small, cohesive group no longer work.
And worse, there are specialist parasites who exploit people's reluctance to change their social norms, and then act in outrageously bad faith.
Sites like HN have survived by changing. For all of the popularity on HN of "unrestricted free speech," HN is actually moderated, and that's why it works.
People are better off hearing a bad idea and hearing it refuted than never hearing it at all. You could say, "well then detractors of the idea will have to keep refuting it," and you'd be right. That's how public forums work, and I hope you don't take everything your parents believed for granted just because their detractors have already been "refuted".
You realize that your argument is an argument against democracy, right?
If mere exposure to bad ideas is this dangerous then mass scale democracy is not viable. The viability of democracy beyond very small groups is dependent on the idea that human beings are capable of independent rational thought and of discussing and evaluating ideas on merit. If that's not the case then the neo-reactionaries are right and democracy is doomed to failure or worse.
What I see here is a major freakout over Trump. IMHO feeling alarmed about Trump is justified, but throwing basic freedoms and the idea of the open flow of information out the window is not the right response. The right response is to ask how and why Trump won and address the problems, ideas, and arguments that led to it rationally. The solution is also to learn how to use this new medium better. If Jones' BS is more "viral," then more rational thinkers need to learn how to play the viral game. You must learn to communicate effectively using the media of the time. If you use social media you must learn to be viral just like previous generations had to study the art of the persuasive essay or how to do makeup properly so you don't look like a sweaty pig on a TV interview (see Nixon vs. Kennedy TV debates).
If it really is true that bad ideas are inherently more viral than good ones and that cannot be combatted, I do not see how democracy can survive at all in the Information Age.
> But when you have the whole internet as your audience, no matter how many times you are refuted, you can say it again and a whole group of people will be hearing it for the first time.
Why is that a problem? You can't force people to believe in reality. Might as well have this process happen in the open to actually provide a disincentive to believing conspiracy theories/propaganda/ads/fake news/whatever.
Hell, I'll go as far as to say that controversial or commonly misbelieved concepts are the ones most worth discussing in a public forum. For instance, consider white supremacy: if you give people a choice between social groups and make them feel welcome, they can use simple, self-serving logic to join the one that meets their needs. White supremacy doesn't rationally help white people - it's just stupid, except for very convoluted goals. See: https://www.nytimes.com/2017/08/22/podcasts/the-daily-transc...
If you don't enable conversations between white supremacists and anti-racists, there is no real choice--they'll gravitate to the group where they're accepted.
When you say it’s like this one way but not the other, you have conceded to relativism. You have basically said there is a time and place for stealing someone’s wallet, and that acting as a moral agent in society can be suspended under the right conditions.
There is a difference between thinking that a certain social medium "works" because of moderation (such as HN) and thinking that NO uncensored platform should ever exist.
I enjoy lots of moderated platforms. But I still think it is important that uncensored platforms like Gab exist at all.
> A common argument in philosophy is that bad ideas should occur in the open - where everyone may ridicule them.
Not to get too far off topic, but is this actually true in practice in the real world? Scientific studies have shown that the idea that gets repeated the most often is the idea that "wins". A bad idea gets repeated and spread every time it gets criticized. If the idea proves more "credible" than the criticism of it, then all that criticism does is contribute to the spread of the idea it's trying to kill.
I would like to know this as well. Case in point, when I was a kindergarten teacher, kids wouldn't have a concept of race. I'd have white girls asking me if I could put their hair in braids like "the tan girls."
But by 4th grade I'd be teaching about slavery, racism, etc, and the kids would be then learning about a concept that was utterly nonsensical to the non-racism-indoctrinated kids. So I'm wondering what would happen if I never taught about racism at all.
FWIW, I think the real question is whether or not people's inclinations can really be changed by exposure to information. To me it seems more likely that people seek out the information which confirms their world view. And if that's true, then I think the imperative to shut down hate speech is greatly undermined.
You can never stop bad ideas from spreading, but you can decide whether you'll ever hear about them or have the opportunity to fight them. If the most repeated idea wins, but bad ideas are forced to spread in hidden places without rebuttal, then bad ideas are the only ones getting any voice and they win by default.
This is called the backfire effect[0]. There is a Debunking Handbook[1] for dealing with this phenomenon.
There is also the idea that a lie travels around the globe while the truth is putting on its shoes, which spurred /r/AskHistorians to ban holocaust denial[2].
That common argument rests on norms established by millennia of relatively difficult communication. Best not to assume that the same people would come to the same conclusions today.
For example, QAnon believers have not had their speech restricted. That conspiracy theory is as open as anything you could name. People have challenged it. It's still going.
If you believe that challenging terrible ideas makes those who hold them think twice... why hasn't the exact argument you're using gotten rid of the terrible idea that we should restrict speech? Answer: because logical refutation doesn't work.
I am not arguing against the slippery slope. This is a very, very hard and risky problem. I am refuting the misconception that shedding light on bad ideas makes them go away.
It can work, if and only if it is done after real communication has been established. Most people trying to refute "obviously wrong" ideas skip that first step and start by arguing in a language and framing that *they* understand. This is noise when the other person uses a fundamentally different framework for what counts as "facts", "evidence", "authority", etc.
One of the better explanations of what I'm talking about is this[1] essay. While it discusses the creationist "debate", the ideas about trying to educate people that are resistant to science and logic apply generally.
Yes, logical refutation rarely works. However, did you bother to learn their language so there is a foundation of actual communication to build upon? Or did you argue in a language they don't understand? Even worse, did your argument appear to contain a lot of phatic expression that is hostile from their point of view?
>I am refuting the misconception that shedding light on bad ideas makes them go away.
It doesn't make them go away. It allows them to be refuted - which hopefully stems at least some of the indoctrination that would otherwise occur only in private.
If everyone I've ever known tells me vaccines are evil and I am never told or shown otherwise - I'll grow up continuing to think vaccines are evil. If you refuse to let me speak about vaccines, that only further proves that you're evil and trying to hide the truth - and strengthens my indoctrination. Now, poor arguments against anti-vaxxers also serve to strengthen the indoctrination, so there's some bad with the good... (Also, just to be clear: "you" and "I" are used generically for easier writing; I support and believe in vaccines.)
There are also plenty of bad ideas that do die to public scrutiny - think of anytime anyone has ever been talked out of "doing something stupid" because they shared what they had planned to do.
Say we remove QAnon from all platforms of reputable note. Do you believe 4chan would ever comply?
Secondly, what impact does that have on the reputable platforms vs 4chan?
How does that all break down for people?
My thoughts are:
1. 4chan will continue very liberal speech. People will continue to visit, and those numbers will increase due to the Streisand effect.
2. The number one and two responses to Alex Jones' recent and widespread ban are:
Who is next?
FINALLY!
And those trend most clearly along authoritarian lines. Anti-authoritarians are more critical and make it clear they do not need to be protected from bad speech.
Authoritarians make it clear they value said protection because of what could happen.
I bet the same happens with QAnon, should that decision get made.
Net outcome:
Polarization along authoritarian lines.
Frankly, I think Jones is laughable. I also think specific speech he made should be removed because it is defamatory and/or inciting in nature.
I think a blanket, "fuck this guy" is a grave error.
I think that because:
The answer to bad speech is more speech. Removing specific posts is itself a form of speech, provided it is reserved for speech that is actually criminal - and in that case it is understandable too.
Bans are not those things, and they can create martyrs as well as drive attention to other platforms, where fans and supporters will quickly gather to discuss the injustice.
I think the laughable should be laughed at too.
Civility is difficult here. Forcing that, as in nobody gets offended, dilutes speech down, and prevents "more speech" from working as it should.
That all forms a resonant cycle we probably will not like very much.
At the root of all this is forced trust and an abandonment of safe harbor. That will not be easy to undo, and it will divide people into those authoritarian vs non authoritarian camps too.
I gotta be honest: I will gladly side with the anti-authoritarians.
I do not need the protection, and I have a very, very thick skin. There are few ways to actually impact me, and I understand my agency in conversation well.
I can't control others, but I can control me. Bad speech from clowns and asses is entertainment. It isn't something I give weight to. Nobody should.
That they do, and that they are enabled with this idea of protection being somehow needed is likely to do more harm than good.
Anyway, the QAnon types will find and communicate where the speech is, just as the Alex Jones types will. Then they will share that, and martyr status should prove effective in rebuilding the lines of communication - only now we have doubt.
Who is next and why?
How do people know the difference between corruption and abuse of power / money from a genuine act to manage criminal speech properly?
What of things like profanity?
More basically, how do people understand they are having a real conversation as opposed to just a permitted, even encouraged one?
I want the real conversation and will seek that. Got no time for Disneyland type conversation.
I feel as though the "sunlight is the best disinfectant" argument has taken a beating in recent times, and it's not difficult to see why.
Take the example of QAnon. It's a movement based around utter nonsense conspiracy theory. It is debunked on a daily basis. It still has a lot of followers.
Compare to Milo Yiannopoulos. I remember around the time his social media accounts were banned that people said doing so was pointless, and that banning him would only increase his popularity by making him look like an outlaw. Yet his profile has sunk almost entirely.
I feel like there's an elephant in the living room nobody is seeing, and that's general loss of trust in our institutions. These kinds of ideas have more traction today than they did 20, 50, or 100 years ago because people trust their government and their public institutions less today than they did back then.
IMHO a certain amount of this calling for censorship is people from those self-same institutions refusing to look at their own role in undermining the trust basis of our society. How many mainstream journalists bothered to check on those Iraq WMD claims or investigate the realities behind the 2008 financial collapse? How many politicians?
Trust is hard to earn and easy to squander. When it is lost it leaves a vacuum. If nobody is trusted the vacuum gets filled by whatever random nonsense sounds superficially good to someone.
To me the popularity of crazy populist demagogues is entirely explainable as a response to several large scale and very notable cases where our institutions have burned their credibility. If something isn't done to correct this and start re-earning trust I predict that this situation will only worsen and we'll be facing full-blown fascism shortly.
Censorship does not re-earn trust. In fact it does the opposite. Right now hordes of people out there are saying "see! Alex Jones must be onto something or why would they go to the trouble of banning him?!?" Microsoft's threat to ban Gab is the kind of PR money can't buy. I bet their traffic numbers will skyrocket.
This has been my position as well for a long time, but I'm starting to feel like there's an unexpectedly enormous number of people out there with zero ability to critically evaluate information, and it seems as if all a terrible idea needs in order to gain critical mass is that it be emotionally satisfying and generally known.
I honestly don't know where we go from here - I continue to believe that giving the state the right to determine what speech is allowed and what isn't is far more dangerous than the alternative, and that's not likely to change. But it's almost as if we've discovered that idiocy is contagious, and the vector is the internet. Maybe it's temporary, because the internet is new - maybe society, having never dealt with a deluge of information like this before, has no natural immunity to it, and it will develop over time. Or maybe we've found a loophole in human cognition that will end up destroying us.
At any rate, I can't fault a private company for trying to do something about it; it's like a bus company finding out their seats are the vector for some horrible disfiguring viral disease and removing them in an attempt to stop the epidemic.
The alt-right playbook (https://www.youtube.com/watch?v=CaPgDQkmqqM) is a great summary of common experiences with this sort of cancer online. It's next to impossible to engage these people in rational debate when all they care about is spreading their hatred and ignorance through the virality of the internet, rendering any civil discussion pointless because they argue in bad faith.
> I'm starting to feel like there's an unexpectedly enormous number of people out there with zero ability to critically evaluate information, and it seems as if all a terrible idea needs in order to gain critical mass is that it be emotionally satisfying and generally known.
Aristotle observed this in the 4th century BC. He wrote about it in his book Rhetoric. This is by no means a new development.
This is abstract idealism, uninformed by contact with the real world. Mobs happen. Moral panics happen. People are not at all, not at all, rational beings evaluating the logical correctness of arguments and using critical thinking to vet new ideas.
The common spaces have to be policed. People with bad intent have to be dealt with not left for everyone to deal with on their own. It is entirely possible to destabilize a society sufficiently for the social contract to break down simply by allowing bad actors to intentionally poison the social commons. You do not want to see that outcome if you are a non-sociopath human.
Experience seems to indicate the opposite, I think. The danger of niche mobs and/or terrorism is very real and scary, but losing our freedoms in order to address these perceived dangers is always far worse. For all the fear the "alt-right" has generated, exactly how many people have come to harm because of them? They probably number in the dozens, which is a terrible tragedy, but is that ultimately worth losing our fundamental rights? I personally don't think it is.
It is a common concept, true. It’s also wrong and not supported by any studies. The reason the idea keeps going is that it keeps getting repeated despite being comprehensively debunked, thus serving as its own contradiction.
(Those entertained by intellectual diversions will observe that if you assume that I am incorrect and that in fact the evidence supports OP, you end up with a logically inconsistent position.)
> I wait to see what will be next on the chopping block.
I wish we didn't have to wait. I wish there were a large enough group of people who would boycott the companies that make up the internet's infrastructure (e.g. ISPs, registrars, DNS, CDNs, cloud providers, etc.) for censoring lawful content. I wish we didn't require mass media articles to get groups of consumers to act in a common interest. And I would prefer it be absolute, regardless of content, because it's frustrating to watch lots of people agree with one decision and then get shocked when these companies become more choosy on the foundation of acceptability that was previously built. To be clear though, I want no government involvement in any way (on the censoring or non-censoring side).
I for one will point an Azure sales rep (or hiring recruiter) to this as a reason I will not engage with them. I have the same feeling about CloudFlare's decision. And I really don't like either of the censored groups.
There was no government involvement in this. Microsoft made a business decision that hosting Gab would subject them to business pressures from other customers/potential customers not wanting to use a service used by hatemongers.
That's not censorship. That's the free market at work. Gab is free to host their own website; and if necessary their own DNS servers. Hell, they're free to go Tor-only.
If there is a sufficiently large group of willing listeners (or determined trolls), then having a bad idea out there in reach of that audience legitimizes it far faster than private discussion could. At which point, challenging them invites the creation of a cult rather than dispersing the idea.
For a frivolous example, my roommate is now convinced that the Star Wars prequels are legitimately good movies because of r/PrequelMemes.
There are some lines you can't cross. I doubt any philosophers would truly argue that free speech is absolute, and that yelling FIRE in a crowded theater should be protected because it is possible to ridicule the speaker and calm the audience before they stampede each other to death. Malicious intent and incitement of violence are not mere speech.
A common argument in philosophy is that bad ideas should occur in the open - where everyone may ridicule them. When they are cast to the shadows they can grow, under everyone's noses, in private where only there are they allowed to exist. In private, these bad ideas cannot be challenged by others and people will be convinced to believe in them, with nobody challenging the idea as a genuinely terrible idea.
I used to subscribe to this position, but now consider it wrong for several reasons.
First, ridicule of a bad idea will have that effect anyway - people who like the idea but not the ridicule will form and gather in niche communities to incubate whatever weird ideas they hold. This is not a bad thing, as such, and can often be productive of new cultural or scientific ideas.
Second, everyone is not a rational philosopher. Some people are easily tricked by fallacious arguments, and other people are willing to treat a threshold level of social proof as equivalent to logical validation. Hence the wide currency of 'skeptical' responses to climate change, the dangers of smoking, or evolution - for a suitable fee, you can find some people who are willing to rent out their reputation or credentials and you can then manufacture the appearance of scientific controversy.
Third, proponents of bad ideas tend to make bad faith arguments, so challenging a bad idea is treated as equivalent to trying to suppress it anyway. Of course, such weak arguments don't hold up very well in philosophy journals or social fora, but that's because people in those contexts generally have academic degrees or equivalent study and there are fairly well established standards of discourse. If someone consistently makes fallacious arguments or recycles debunked ones eventually they will be ignored. In broader social fora those institutional safeguards are absent, while nontrivial socioeconomic payoffs for the production of bad faith arguments are present - in other words, you can make a good living out of pretending to believe the earth is flat because there's a market for comforting lies that help to relieve people's existential anxieties, and the cost of repeating the false ideas is relatively low compared to the cost of debunking them over and over.
Also, you keep making non-sequitur arguments, going from claiming that bad ideas will fester unnoticed in private spaces to slippery-slope 'you're next' arguments within the same paragraph. The latter does not follow from the former, and in fact implies that it is not possible to distinguish between good and bad ideas and that if bad ones are driven to the margins good ones will suffer the same fate despite their merits.
I'd love to see an analysis of whether this theory is true. There's another argument for allowing hateful speech: that it acts as a containment strategy. It appears there isn't much to that theory. An analysis of reddit's 2015 ban of r/fatpeoplehate and r/coontown did not see hateful speech increase in the places the 'migrants' went.
That isn't necessarily supported by evidence, nor is it the most sophisticated view in contemporary philosophy.
Good evidence can't be trusted to change minds - see, for example, the classic paper "Biased Assimilation and Attitude Polarization: The Effects of Prior Theories on Subsequently Considered Evidence" (https://www.unc.edu/~fbaum/teaching/articles/jpsp-1979-Lord-...).
It's certainly possible the people working at Microsoft believe they have a moral duty to keep hate speech like that off their platform. I know that if I worked at a platform company and there were folks advocating physical harm to anyone on that platform, I'd be pretty keen to remove them.
I don’t think free speech means a platform company has some imperative to offer a level playing field to racists.
> A common argument in philosophy is that bad ideas should occur in the open - where everyone may ridicule them
Problem is that on the internet there is no open commons where the bad ideas can occur; instead they generally occur on someone's service at some level. This leads to conflicts between the freedoms of the service providers and those of their clients.
> When they are cast to the shadows they can grow,
It's a nice poetic conceit to think of good ideas as thriving in the light like flowers while bad ideas flourish in the darkness like mushrooms, but they both thrive on the same thing -- attention. To make another comparison with things that grow in the shadows: the crawlspace under my front porch is full of bugs and critters, but they didn't get there because I cast them out of my house, and I don't keep my house bug-free by periodically inviting them in for debates.
Not much is actually getting censored. The bad stuff just gets moved to less respectable sites. The Stormer for example is still up there if you want to read it.
Is this an argument in favor of the Streisand effect? After all, most of us wouldn't have known about these posts if Microsoft hadn't asked for them to be deleted.
Or is it an argument that there is not enough social shaming on the Internet? Since that's what we're likely to get instead of reasoned debate.
Maybe the old arguments in favor of debating bad ideas need to be rethought to account for the fact that there are so many people posting them, and they don't necessarily deserve attention. It's not 1995 anymore; we know that attention is valuable, and it's important to have efficient defenses against denial-of-service attacks due to spam, trolling, viral post-sharing, and so on. Debating every post of the same old outrage isn't efficient.
For bad ideas that aren't new, something like Snopes makes a bit more sense. And I'll point to the Ask Historians subreddit for a good example of how to run a high-quality Internet forum.
These aren't merely "bad ideas". It's hate speech advocating for violence against a historically oppressed group. There is no slippery slope; we know what hate speech and anti-semitism are, and those are the rules being applied here.
> A common argument in philosophy is that bad ideas should occur in the open - where everyone may ridicule them.
Unless people with those bad ideas find communities that accept those bad ideas. This is what happens with content bubbles and niche extremist communities.
"In one post, Little said he would livestream himself destroying an unspecified Holocaust memorial in the U.S."
Ugh. Did you even bother to read the article? This has absolutely nothing to do with "bad ideas." MSFT asked Gab to remove content that "incites violence." MSFT is not cracking down on anyone because they have some disagreement with their public policy positions or political philosophy.
Yelling "fire" in a crowded theater is not a "bad idea" - it's deliberately putting people in immediate danger. Likewise, advocating that we physically harm people or commit acts of terrorism against specific locations is not a matter of "bad ideas"; it's advocating specific actions that will physically harm other people.
Obviously, there's no censorship of edgy and extreme ideas by MSFT, if MSFT wanted to do that they would not be hosting Gab to begin with.
Instead, all they're doing is cracking down on content which incites violence, which is in violation of their terms of service, and is absolutely not protected by the First Amendment.
Finally, can you cite any evidence from the psychological literature showing that shining light on bad ideas changes anyone's mind? Just about everything I've read has suggested that's not the case.
Inciting violence is not free speech, just like yelling "fire!" in a crowded theater is not free speech.
The article clearly states that the removed comments incite violence:
> "Microsoft received a complaint about specific posts on Gab.ai that advocate ‘ritual death by torture’ and the ‘complete eradication’ of all Jews. After an initial review, we have concluded that this content incites violence, is not protected by the First Amendment, and violates Microsoft Azure’s acceptable use policy," Microsoft said in a statement to The Hill.
Are you suggesting that this type of language should be protected by free speech laws?
Not sure how old that common argument is, but its proponents weren't aware of what internet social media would look like over the last 5-10 years - especially considering the well-funded campaigns promoting things like climate change denialism, among others.
Big-hammer censoring is not the solution, as it also enables censoring the truth (truth that may threaten or inconvenience whoever wields the big hammer), but sometimes respecting the freedom of those who don't respect the same right for others is not a wise policy.
The problem is combining ignorance with confirmation bias and then multiplying it through distribution engines like FB, YouTube, podcasts, etc.
Unfortunately, information is NOT absolute. It's asymmetrical and heavily influenced by our limited world view, our ignorance, and our confirmation biases... allowing bad agents to take advantage of this fact, and then letting them use massive distribution channels like FB, is a recipe for unrest.
Has Microsoft identified similar posts on Gab in the past, and requested/obtained their removal -- or else? (is Gab acting as a sudden victim?)
If not, how is that possible - since Gab was created explicitly to tolerate pretty much any kind of barely-legal speech that came under pressure on Twitter? Are we to believe that this style of expression is completely new to Gab? (Or did Microsoft just now notice?)
>A common argument in philosophy is that bad ideas should occur in the open
Yes, we all know about that theory, but what we have seen happening in the last 10 years, as the internet has become populated by the masses, seems to suggest that theory is very wrong.
On a more pragmatic note, it's much harder to track these groups as well on the dark web. On the internet, anyone can see their thoughts and it's much easier to track their location as well.
Another argument against prohibiting speech on some platform is that it will push people to platforms without any speech restrictions, radicalizing their views.
I'm shocked by how many liberals who really ought to know better are cheering for the deployment of corporate oligopolistic censorship at this scale. We're not talking about just kicking someone off Twitter... now we're talking about kicking them off supposedly neutral public cloud platforms.
This is a major escalation and honestly it's changing my mind a bit about the whole issue. I was a fence sitter before but now I'm siding with the libertarian crowd on this one, and I say this as someone who is nowhere near the alt-right politically.
We're going from what amounts to forum policing (though on quasi-public forums at huge scale) to infrastructure-level censorship. No longer can you just "go set up your own site." Even more significantly, we are seeing an incredible amount of coordination between corporations on this, proving that competition and diversity in the marketplace are not sufficient to protect the openness of the Internet as a system.
Once again: liberals really should know better, especially any that care about net neutrality. This is a really extreme example of ISP traffic discrimination and it sets a precedent that this kind of thing is okay. A top level infrastructure provider blackballing a site for speech should be a third rail regardless of the content in question.
Censors always start with the least popular ideas and speakers. They do that for a reason: they want to gauge what they can get away with and they want to shift the Overton window toward increasing support for censorship. They know that few people will overtly stand up to defend Nazis, trolls, and blithering red faced demagogues, so that's where they start. Other popular targets in the past included pornographers, lewd writers and musicians, and religious blasphemers.
Two thoughts that ought to keep you up at night even if you are inclined to agree with these moves:
(1) What happens if/when the political winds shift and someone like Donald Trump or maybe someone even to the right of him ends up in control of these powers? Who gets silenced then? A power once created becomes a political entity that can easily be handed off.
(2) What happens if/when some liberal cause -- like a reboot of unionization and a new labor movement, for example -- really threatens corporate and Wall St. profits at a large scale? Will neo-unionists get deplatformed and shadow-banned now that these tactics have been legitimized? Will anyone even notice?
Again... we are setting a precedent here that this is okay. You really all ought to know better. The reference metric for freedom of speech is the freedom enjoyed by the most offensive and least popular speakers: political fanatics, pornographers, demagogues, hate-mongers, etc. If they are free to speak then you are free to speak. If they're not then the process of clamp-down has started and you are next.
There's a reason the ACLU has in the past defended neo-Nazis and other unpopular speakers, and it's not because they support these speakers' messages. It's because you need a canary in the coal mine.
Edit:
The conspiracy nut devil that hangs out on my shoulder whispers that this combined with the abolition of net neutrality is a coordinated campaign. Use massively unpopular triggering demagogues to get the left to abandon its commitment to free speech while simultaneously using fallacious libertarian arguments to get the right to eliminate net neutrality. Put those together and you have a Great Firewall of America and enough public support from a broad enough subset of Americans to deploy it.
(The libertarian arguments against NN are fallacious because ISPs are government backed and sometimes even funded monopolies.)
The next ratcheting up would be for peering points to refuse to peer Internet traffic that they don't like, and abolition of net neutrality would allow that. That means Gab (or tomorrow moveon.org, who knows) couldn't even host at an indie hosting provider or overseas.
But that's crazy talk right?
If sites like Gab get banned from being hosted the next step for them will be to move to decentralized platforms or Tor. If the above is true I predict that this combined with the words or deeds of a few nutjobs will be used to start convincing both liberals and libertarian-minded conservatives to start supporting bans on un-escrowed encryption as well as ISP and cloud provider efforts to block "horizontal" network traffic. This is how you'll be convinced to support a ban on P2P protocols and privacy technologies.
> Censors always start with the least popular ideas and speakers. They do that for a reason: they want to gauge what they can get away with and they want to shift the Overton window toward increasing support for censorship.
Every time a neo-Nazi gets shut down, people trot out the "First they came for the communists…" — and yet the next time, it's a neo-Nazi again, every time. It would be bad if people who aren't promoting harm start getting shut down, but I don't see any evidence that current events will lead to that.
> (1) What happens if/when the political winds shift and someone like Donald Trump or maybe someone even to the right of him ends up in control of these powers? Who gets silenced then? A power once created becomes a political entity that can easily be handed off.
We're not talking about the creation of a new power here. This is a power that already exists, and bad actors' ability to use it isn't based on whether or not it's used for good.
> What happens if/when some liberal cause -- like a reboot of unionization and a new labor movement for example -- really threatens corporate and Wall St. profits at a large scale? Will Neo-unionists get this deplatformed and shadow banned now that it's been legitimized? Will anyone even notice?
Again, there is no causal line between "Neo-Nazis get deplatformed" and "Corporations deplatform their critics." You can have either one without the other.
In general, your fears seem to be based on an assumption that looks something like "Bad people won't do anything to good people that good people don't do to them." But history has repeatedly shown that this is not the case. Our refusal to take the threat of Nazism seriously the first time didn't in any way constrain the things they did to people.
>There's a reason the ACLU has in the past defended neo-Nazis and other unpopular speakers, and it's not because they support these speakers' messages. It's because you need a canary in the coal mine.
People aren't mentioning this, but this has already happened.
I'm not white, just to be clear, but back sometime last year I knew infrastructure-level censorship was coming when domain registrars, GoDaddy in particular, started censoring sites. The canary in the coal mine was the Daily Stormer.
It's one thing to want them to be quiet. It's another thing to set the precedent of infrastructure forcing them to be quiet. As you said, it sets a precedent.
EDIT: DNS censorship has indeed been mentioned in this thread.
>libertarian crowd on this one, and I say this as someone who is nowhere near the alt-right politically.
One thing that concerns me is that people who push back on this, or who lean libertarian, are seen as alt-right simply because they disagree with this corporate censorship. To me this censorship or suppression goes against the founding principles of the internet. I don't understand why it is acceptable to label everyone as such.
Censorship of legal speech isn't the answer, ever. I can't wait for SV to find its compass again and return to the love of free speech. That said, I do think we need a solution so that our open platforms let people avoid certain things they find toxic. I don't want to read ideas from certain people, so letting the end user decide is the best solution.
> A common argument in philosophy is that bad ideas should occur in the open - where everyone may ridicule them.
Also, so that good ideas can have their day in the sun. Let's not forget that most "good" ideas were once considered bad ideas.
A list of ideas that were once considered bad:
Abolitionism. LGBT rights. Heliocentrism. Germ theory. Endless list.
The idea is that all ideas deserve discussion so that the good rises and the bad sinks.
Also, by allowing censorship when you have likeminded people in power, you also allow other people to censor in the future when power shifts.
This is just basic stuff you learn in philosophy 101. Unfortunately, we don't seem to be teaching basic principles in high school/college anymore.
People naively think that censorship keeps the bad ideas away. Most of the time, it is used to keep the good ideas away. Look at the censorship by the Catholic Church, the Soviet Union, China, Nazi Germany, etc.
If your ideas have solid footing, you should be against censorship. It's when people can't defend their ideas that they advocate for censorship.
There is a big difference between ideas like LGBT/PoC/women's rights and abolitionism on one hand, and white supremacy, racism, and religious fundamentalism on the other: one advocates for inclusion, social peace, and acceptance; the other for exclusion, hate, and race/religion war.
This is not something we can ignore by treating all ideas as if they were the same.
In Europe, it's very common to have an exception to free speech when it comes to hate speech, and I have never seen it used in the "wrong" way to keep good ideas away.
>After an initial review, we have concluded that this content incites violence, is not protected by the First Amendment, and violates Microsoft Azure’s acceptable use policy
If the posts were illegal, why isn't the poster being arrested? Where are the real police in this matter? Why are California cops deferring action to the Redmond internet police?
This sort of thing illustrates the biggest issue with web hosting, domain registrars and the internet's infrastructure in general; everything is privately owned by companies who get unlimited rights to decide who and what they want to host.
The solution is to require all hosting and domain companies to act as utilities, and require neutrality in regards to any content that's legal. They're not private forums or homes, they're the internet equivalent to the electricity company or the water company. They're the internet equivalent to a phone service provider or ISP.
Why do/did ISPs have to deal with net neutrality while large hosting companies and registrars get free rein? The power company can't cut off your service because you offended someone online or whatnot.
If you want to argue the internet is a place for free speech, then it needs places you can host content/register domains/whatever that act like a public square, not a shopping centre.
The phone company would be a better analogy, and they absolutely can cut off your service if you're being a nuisance. The First Amendment says your government won't pass laws against you saying stuff; it stops you from going to jail for the things you say, and it's there to protect the people's right to criticise the government.
It is not there to force companies to serve you if they don’t want to.
Okay, so sticking with that analogy, can your phone company cut off your service if you're making racist comments or discussing unpopular political ideas with your friends over the phone? Or, perhaps more importantly, _should_ they be able to?
> The solution is to require all hosting and domain companies to act as utilities, and require neutrality in regards to any content that's legal. They're not private forums or homes, they're the internet equivalent to the electricity company or the water company. They're the internet equivalent to a phone service provider or ISP.
Exactly this. When the First Amendment was authored, I don't think it was even conceivable that any organization besides a government or state church could effectively censor speech at scale. Now that we have "private" organizations that effectively have that power, it's perhaps time that the law be updated to reflect that change in facts and preserve people's right to free expression.
When the First Amendment was authored, I suspect people thought public property was going to be where the largest audience was, with the town hall and park and square and what not being the main platform for discussion. Reaching the crowd meant going out with a handbell or standing on a box passing out pamphlets. Stopping the government from arresting someone advertising in the town square was more important than stopping a shop from kicking out a customer they didn't like.
But that's not the case now. The platforms hosting your work are privately owned, the large community websites/social networks/whatever with millions or billions of users are privately owned, and a larger portion of real life locations people frequent are privately owned too.
The way society is going seems to be making the amendment less and less relevant as time goes on.
> Why do/did ISPs have to deal with net neutrality while large hosting companies and registrars get free rein?
Because ISPs are naturally and sharply limited by infrastructure requirements (access to and across property, or access to scarce spectrum), while anyone who connects to an ISP can host content.
Anyone with a server can host content in a technical sense, but can it ever scale without playing ideologically with the big players?
One tactic for deplatforming undesirable sites is a DDoS -- if you put your home server online, what options do you have for DDoS mitigation if the big players in that space decide they're on the political side of those doing the DDoS?
Are people still able to download a web server and host their own content? What does anyone need a host for? If a bunch of angry racists want to start their own social network, they could build it, if they really wanted a place to spew their hate. Is anyone really stopping them from connecting to the internet?
There are a lot of people here completely in support of this, and it's not hard to trace an IP address to an ISP. When brigades start calling for ISPs to stop serving these sites, do you really think that's when they'll stand up for free speech?
Keep in mind in the span of just a few days, we've seen the jump from application platforms (Facebook, Google) to now infrastructure providers. The leap to ISPs is not that far, and something tells me when it happens, supporters will simply move the bar again.
Could they? Couldn't an ISP decide they don't want to serve that hateful content over their pipes and disconnect them? Or what if the domain name registrar decides they don't want to do business with an angry racist? Or what if an upstream backbone provider decides they could get some good PR by blackholing their IP?
These days, private companies have a _lot_ of power over what speech is allowed to be heard. Perhaps even more so than the government.
The solution is not to create more laws which will give the real danger (governments) more power to use their violence.
The solution is to simply have everyone host themselves. Even a moderate home broadband connection is more than enough to serve up your personal website.
Once you start relying on third parties you've lost. That's their property, not yours. When you claim it is yours and use government violence to enforce your wishes you are doing more damage to society than what you're trying to stop.
It’s impossible to avoid relying on third parties. Somebody has to lease you the DNS name; we’ve already seen that used to punish sites. Somebody has to host your server or sell you an internet connection. Moving to Tor is the only way to protect yourself from those means of punishment, at which point you’ve lost 99.99% of any audience you might’ve had. You might as well just go to a physical newsletter.
Your ISP is also a private third party. If Microsoft rejects you why wouldn't Comcast? The 70 dollars you pay for a home connection isn't worth dealing with a twitter mob, so they'll gladly boot you.
Hosting companies never had to, because net neutrality is only about the Internet itself, not the edge services that host websites and whatnot.
One bit of hilarious irony is that the only reason people in the US can access the Daily Stormer is net neutrality (now just de facto), yet the Daily Stormer was full of people calling for the end of net neutrality.
One thing that hasn't yet been remarked on here is that the 48-hour-warning message said that the linked posts were flagged for having phishing links. Notice that the statements made afterward by MS were not on the grounds of the original "phishing links" report, but focusing on the content.
However, if you look at the posts, you'll see a fair amount of unsavory content but no phishing links.
I feel like this was purposeful on the part of whatever political group made these reports.
Most likely, these hosting companies have automated anti-spam/phishing systems whereby, if a large number of reports come in, they will automatically send out an alert to the suspected offending party.
If Outraged Group X goes to the company saying "this site you're hosting has vile speech and so you should take it down," it probably has to go through an internal company process, which might have a slightly higher bar for acting on it.
However, if Outraged Group X falsely claims that "this site is hosting phishing links", something which might trigger an algorithmic response, then the company cannot easily reverse course -- because then the story becomes "Company ABC actively reverses course on allowing hate speech." The PR fallout locks them in.
It's disconcerting to see how many tech and free speech people are clamoring for the censorship of dissenting opinion and are jubilant when it happens.
Free speech has always been a very counter-intuitive process. Opinions that directly oppose yours, or even your very person, will rub you the wrong way, and being intent on defending your opponent's right to voice his opinion seems paradoxical, but we nevertheless concede that this serves the greater good.
The recent Alex Jones incident was legitimized on the basis of dissemination of dangerous ideas. With Gab, the argument is that it is hate speech. The most compelling argument for free speech is that it removes the subjective and variable limits of speech and by extension, slippery slopes.
If you must know, while I do consider myself a political conservative, I cannot bear to listen to Alex Jones and I've never used Gab. I'm radically free speech which means I enjoy living in a world where literally everyone can say quite literally everything. Whether there is a slippery slope angle to all this is not even relevant for me. I want to hear the ideas of everyone who is broadcasting.
The very arguments I am expounding on are already controversial, but they weren't very long ago. Makes you think.
The free speech absolutists would have been more convincing had they been as up-in-arms about the banning of ISIS or ISIS-affiliated accounts over the last several years. This happened by the hundreds of thousands.
For better or worse foreign and domestic terrorist speech is treated separately. If ISIS was a domestic terrorist organization afaik they would have 1A rights.
The right to free speech shouldn't supersede all other fundamental human rights. Why should we afford modern day nazis a platform to call for the extermination of jews, when they so obviously do not respect the right to life or freedom of the people they hate? How can you engage in rational discussion with someone who does not believe you have a right to live?
> The most compelling argument for free speech is that it removes the subjective and variable limits of speech and by extension, slippery slopes.
> [...] I'm radically free speech which means I enjoy living in a world where literally everyone can say quite literally everything. Whether there is a slippery slope angle to all this is not even relevant for me.
Perhaps you'd like to explain the fundamental difference between the outcomes that lie at the bottom of these variably tolerable slopes.
The posts in question are frankly pretty awful, but this does kind of put the lie to the idea that if you don't like FaceGoogleBook's censorship you can just set up your own website.
Well you can host your own website on your own server.
Of course, I do wonder how far this goes. What happens if DNS providers refuse to resolve your domain? Run your own DNS server? What if browsers refuse to allow access to the site? Build your own browser? What if ISPs refuse to transfer the data over the wire? Make your own internet?
People have argued that there is a core difference between this and Net Neutrality (at least as far as an ISP's ability to control what goes over its wires), but it feels to me that the two are closer than people realize, and the standards set by one can influence the other.
> Well you can host your own website on your own server.
Sure, then the people who drove you off Facebook/Amazon/Apple iTunes/Youtube/Azure/EC2/Digitalocean/Cloudflare will go after your DNS registrar and the ISP/CDN/colocation-facility you're using. They won't give up and they'll ruthlessly go after every single commercial entity you do business with. After all, if it's not the government punishing you for speech, it's not censorship and it's fine!
That's the trick of this kind of lawfare -- having any kind of internet presence inherently involves relationships with private, non-governmental entities, and it's disturbingly easy to suborn them and shame them into refusing to do business with unsavoury people.
And like for everything else, first a precedent is created with an indefensible case of terrorism/pedophilia/neo-nazism, then the rule gets progressively applied more broadly.
> What happens if the DNS server refuses to host the IP lookup?
No need. It's easier if DNS registrars conspire to confiscate and/or refuse to sell domain names. This has already been done. SSL certs can, and have been, revoked. The ISPs haven't really been involved in these types of actions since the 90s, so I'm curious what their stance is. For now, if you've been run off the face of the WWW, you still have TOR hidden services or IPFS. Who knows what will happen when the ISPs get involved.
Personally, I would draw the line at requiring ISPs to carry legal IP traffic fairly ("net neutrality"), with pretty much everything else being fair game for terms of service etc. I would consider extending the "neutrality" aspect to domains, considering that ICANN effectively has a monopoly on DNS.
There are still some practical problems for hosting your own website with those "primitives", most importantly getting an internet connection suitable for hosting anything in the first place if nobody is willing to co-locate your servers. One would hope that with the proliferation of IPv6 and FTTx, actually hosting stuff from your basement would become more realistic.
The only thing you can really do is use Tor. But you have to be very, very careful and know what you're doing. It's not something most people can set up, and not something most people can (easily) access.
Well, they kind of shot themselves in the foot by leaving big social media platforms... for a big server hosting platform. If they wanted to go Henry David Thoreau on the corporate internet, why didn't they buy physical equipment?
Having read the article but not knowing anything about the posts themselves, I think most of this thread misses the point.
Clearly Microsoft knew it was hosting Gab and what Gab was. There's a troll argument that suggests Gab is just "free speech twitter", but of course that's not the case: I've been screenshotting the front page for months, from a random anonymous account, and every time I've done it the front page was full of horrible Islamophobic, racist, and anti-Semitic crap. That and bot content.
My point here is: everybody knows that's what Gab is. Microsoft isn't pushing back on Gab's anti-Semitism --- without anti-Semitism, there is no Gab. They had specific harm reduction problems with a pair of posts, one of which, according to this article, was a call for violence directed towards Jewish people.
> Clearly Microsoft knew it was hosting Gab and what Gab was. There's a troll argument that suggests Gab is just "free speech twitter", but of course that's not the case: I've been screenshotting the front page for months, from a random anonymous account, and every time I've done it the front page was full of horrible Islamophobic, racist, and anti-Semitic crap. That and bot content.
This seems to accurately describe "free speech twitter". All the content that would get people banned on Twitter is displaced onto sites like Gab and 4chan. The few occasions I've perused these sites supported this: there was a lot of hateful speech against minorities, and a few instances of hateful speech against the majority. The latter was a very narrow minority, but as far as I could tell these sites upheld their commitment to freedom of any legal speech for all and didn't censor it.
If there is a large enough group of people that's effectively banned from mainstream social media, it's entirely natural that they will be over-represented within the largest platform that doesn't ban them. This doesn't mean the platform in question is designed specifically to cater to those groups.
" Gab is absolutely not a “white nationalist social media platform.” We are a free speech social media platform. We welcome everyone and have since the day we launched. My co-founder Ekrem is a Muslim Kurd in Turkey. Our Chief Communications Officer Utsav is an Indian and a practicing Hindu. Our “frog logo” that the media wants so desperately to tie to “pepe,” was inspired by Exodus 8:2–7 and was designed by our Creative Director Brandon, who is Jewish."
That's an interesting thing for him to say, because Gab is absolutely and obviously a white nationalist social media platform.
Want another example? Arguing with Ken White (Popehat) on Twitter, the official Gab account RT'd a white nationalist mocking Ken for having adopted Asian children.
Exodus 8:2-7 alludes to an ultimatum threatening to inflict a plague of frogs on Egypt. Because when I want to demonstrate my love of freedom and inclusivity, the first thing I think of is calling down plagues on my enemies.
Just for kicks, I made an account, and the front page currently looks like this for me:
Popular Posts
1) Thug Who Destroyed Trump’s Star and Bragged About It Gets Hit with Felony Charge, Facing Hard Time
2) Alex Jones Breaking: Son Of Terrorist Master Mind Caught Training Child Soldiers To Commit School Shootings Tune in M-F 8-11a
3) Candice Owens Tweets: BREAKING: 71 Illegal aliens were shot, trying to cross the border this morning! —Just kidding. It was actually black people in Chicago last weekend. You can go back to not giving a damn, liberals.
4) Military Support: Honoring Air Force Maj. Walter D. Gray who selflessly sacrificed his life six years ago in Afghanistan for our great Country. Please help me honor him so that he is not forgotten.
Well, I don't have an account, but this blew up sufficiently enough that I only needed to click "explore" and stumbled on a repost containing a screenshot of the original post. Here you go: https://gab.ai/PNN/posts/31263515
Wow, what a sad website. I don't quite understand the mindset/vitriol/hate/anger/etc here (the reason for it or need for it) psychologically speaking.
In any case I've taken my own simple screenshots of the repost in case it disappears, which I expect will happen eventually. If anyone absolutely needs them let me know.
How is a website sad just because it offers true free speech? I think it's sad that the perception of free speech is being cast in such a bad light. Free speech is _only_ relevant if it covers the controversial things, too.
For what it's worth: To the user, Gab is just "free speech Twitter". It just happens that the people whose speech is being repressed say things which make you want to repress their speech. Twitter used to be "the free speech Twitter" for many of these people, and their arguments shared a platform with unhindered common speech (and, as was my personal pleasure, were rebutted by many people who were at no personal risk of being repressed).
There happen to be plenty of people who go around collecting and rebutting white (and other ethnic) separatists/nationalists on Gab, and they aren't being repressed (in fact, some are receiving tips for the service).
You've been using HN primarily for political and ideological battle. That's an abuse of the site and we ban accounts that do it, regardless of their politics. Please review https://news.ycombinator.com/newsguidelines.html and use HN as intended from now on.
"Users are prohibited from calling for the acts of violence against others, promoting or engaging in self-harm, and/or acts of cruelty, threatening language or behaviour that clearly, directly and incontrovertibly infringes on the safety of another user or individual(s)."
Why would they give Little the option to delete his posts instead of banning him and removing the posts themselves? If I was in their position, it wouldn't even be a question of free speech considering he was breaching the ToS.
The content here is quite disgusting, and I certainly do not defend it, but I really take issue with what is happening here in principle.
Overall, this is similar to the runaway effect that environmental negligence can have. At a certain point it will be too late to solve the problem, i.e., to control emissions or to oppose this kind of authoritarianism under different circumstances. The very means to do so will be either impotent or impossible.
By then you are reliant on some new magical invention to resolve the problem, or a large death toll in the millions, which changes the entire landscape.
Corporations certainly have the right to do this, but they are only choosing to do so because there are strong winds blowing in this direction. It's important that they are opposed, before this paves the way for big mistakes to be made.
It's important that these kinds of ideas see sunlight. Ideas that promote tolerance, peace and common sense will win in the end, provided we keep talking. If talking comes to an end, we know what comes next. These ideas will only find more room to grow, and see greater validation when they are opposed in this kind of authoritarian manner.
Rather than advocating a strategy where we are relying on governments and corporations to solve these problems, we should be the ones fixing our societies and arguing against disgusting ideas such as this, with well reasoned arguments, or even humor, both of which have historically been great at winning ideological battles.
I am sure this crowd knows their Aaron Satie.
"With the first link, the chain is forged. The first speech censured, the first thought forbidden, the first freedom denied, chains us all irrevocably."
"When our enemies say: But we used to grant you freedom of opinion -- yes, you granted it to us, that is no proof that we should do the same for you! That you gave us that just proves how stupid you are!"
So take Goebbels' advice? Not sure what you are getting at.
Our enemies are fascists who will take away our rights, therefore anyone suspected of eventually becoming a fascist will have their rights preemptively taken away so that it does not come to that?
It gives bad ideas a breeding ground to foster and you'd never even know about it. By publicly prohibiting speech all that happens is it is brushed under a rug and people pretend it doesn't exist. That doesn't make the problem go away, it's the same as shutting your eyes and covering your ears and pretending the monster in front of you no longer exists. If you get to define certain speech as unspeakable you justify the censorship of any speech determined to be unspeakable when you lose power and someone else steps up to the plate. As history has shown - it's a matter of "when" not "if".
Actions like this serve to placate the public. The general public won't be angry at Microsoft for censoring platforms "known for" or "complacent of" hate speech (or worse). It's always a slippery slope though. Once you open up the can of worms that is censorship you justify future censorship.
I wait to see what will be next on the chopping block.
Edit: A few too many responses to respond to everyone, but if anyone would like to speak with me on it more in depth feel free to email me - information on how to contact me can be found on my profile.
I have been on the Internet since the 80s. What I have personally found is that a lot of its social norms were lifted from academia, and they work very well as defaults in a new social media context. Like Hacker News when it was first launched.
But eventually, every "social medium" on the Internet has its "Eternal September," when the number of new folks in any conversation outweighs the old hands. At that point, the social norms that worked for the small, cohesive group no longer work.
And worse, there are specialist parasites who exploit people's reluctance to change their social norms, and who act in outrageously bad faith.
Sites like HN have survived by changing. For all of the popularity on HN of "unrestricted free speech," HN is actually moderated, and that's why it works.
If mere exposure to bad ideas is this dangerous then mass scale democracy is not viable. The viability of democracy beyond very small groups is dependent on the idea that human beings are capable of independent rational thought and of discussing and evaluating ideas on merit. If that's not the case then the neo-reactionaries are right and democracy is doomed to failure or worse.
What I see here is a major freakout over Trump. IMHO feeling alarmed about Trump is justified, but throwing basic freedoms and the idea of the open flow of information out the window is not the right response. The right response is to ask how and why Trump won and address the problems, ideas, and arguments that led to it rationally. The solution is also to learn how to use this new medium better. If Jones' BS is more "viral," then more rational thinkers need to learn how to play the viral game. You must learn to communicate effectively using the media of the time. If you use social media you must learn to be viral just like previous generations had to study the art of the persuasive essay or how to do makeup properly so you don't look like a sweaty pig on a TV interview (see Nixon vs. Kennedy TV debates).
If it really is true that bad ideas are inherently more viral than good ones and that cannot be combatted, I do not see how democracy can survive at all in the Information Age.
Why is that a problem? You can't force people to believe in reality. Might as well have this process happen in the open to actually provide a disincentive to believing conspiracy theories/propaganda/ads/fake news/whatever.
Hell, I'll go as far as to say that controversial/commonly misbelieved concepts are the most worth discussing in a public forum. For instance, consider white supremacy: if you give people a choice between social groups and make them feel welcome, they can use simple, self-serving logic to join the one that meets their needs. White supremacy doesn't help white people, rationally, it's just stupid, except for very convoluted goals. See: https://www.nytimes.com/2017/08/22/podcasts/the-daily-transc...
If you don't enable conversations between white supremacists and anti-racists, there is no real choice--they'll gravitate to the group where they're accepted.
I enjoy lots of moderated platforms. But I still think it is important that uncensored platforms like Gab exist at all.
Not to get too far off topic, but is this actually true in practice in the real world? Scientific studies have shown that the idea that gets repeated the most often is the idea that "wins". A bad idea gets repeated and spread every time it gets criticized. If the idea proves more "credible" than the criticism of it, then all that criticism does is contribute to the spread of the idea it's trying to kill.
But by 4th grade I'd be teaching about slavery, racism, etc, and the kids would be then learning about a concept that was utterly nonsensical to the non-racism-indoctrinated kids. So I'm wondering what would happen if I never taught about racism at all.
There is also the idea that a lie travels around the globe while the truth is putting on its shoes, which spurred /r/AskHistorians to ban holocaust denial[2].
[0] https://en.wikipedia.org/wiki/Rebound_effect
[1] https://www.skepticalscience.com/docs/Debunking_Handbook.pdf
[2] https://slate.com/technology/2018/07/the-askhistorians-subre...
For example, QAnon believers have not had their speech restricted. That conspiracy theory is as open as anything you could name. People have challenged it. It's still going.
If you believe that challenging terrible ideas makes those who hold them think twice... why hasn't the exact argument you're using gotten rid of the terrible idea that we should restrict speech? Answer: because logical refutation doesn't work.
I am not arguing against the slippery slope. This is a very, very hard and risky problem. I am refuting the misconception that shedding light on bad ideas makes them go away.
Moon landing conspiracy
Anti-vaccers/Vaccines cause autism
Fad diets and "detox"s
Anti-intellectualism
Alex Jones (to the tune of millions of subscribers on his recently killed youtube)
Global Warming
etc
It can work, if and only if it is done after real communication has been established. Most people trying to refute "obviously wrong" ideas skip the first step and start by arguing in a language and framing that they understand. This is noise when the other person uses a fundamentally different framework for what counts as "facts", "evidence", "authority", etc.
One of the better explanations of what I'm talking about is this[1] essay. While it discusses the creationist "debate", the ideas about trying to educate people that are resistant to science and logic apply generally.
Yes, logical refutation rarely works. However, did you bother to learn their language so there is a foundation of actual communication to build upon? Or did you argue in a language they don't understand? Even worse, did your argument appear to contain a lot of phatic expression that is hostile from their point of view?
[1] http://scienceblogs.com/clock/2007/05/31/more-than-just-resi... (see the heading "Hierarchical View of the World" and "The Problem of Language" for the main argument)
It doesn't make them go away. It allows them to be refuted - which hopefully stems at least some of the indoctrination that would otherwise occur only in private.
If everyone I've ever known tells me vaccines are evil and I am never told or shown otherwise - I'll grow up continuing to think vaccines are evil. If you refuse to let me speak about vaccines, that only further proves that you're evil and trying to hide the truth - and strengthens my indoctrination. Now, poor arguments against anti-vaccers also serve to strengthen the indoctrination, so there's some bad with the good... (Also just to be certain on things: "you" and "I" are used generically for easier writing, I support and believe in vaccines.)
There are also plenty of bad ideas that do die to public scrutiny - think of anytime anyone has ever been talked out of "doing something stupid" because they shared what they had planned to do.
Say we remove Qanon from all platforms of reputable note. Do you believe 4chan would ever comply?
Secondly, what impact does that have on the reputable platforms vs 4chan?
How does that all break down for people?
My thoughts are:
1. 4chan will continue very liberal speech. People will continue to visit, and those numbers will increase due to the Streisand effect.
2. The number one and two responses to Alex Jones recent and wide spread ban are:
Who is next?
FINALLY!
And those trend along authoritarian lines most clearly. Anti authoritarians are more critical and make it clear they do not need to be protected from bad speech.
Authoritarians make it clear they value said protection because of what could happen.
I bet the same happens with QAnon, should that decision get made.
Net outcome:
Polarization along authoritarian lines.
Frankly, I think Jones is laughable. I also think specific speech he made should be removed because it is defamatory and/or inciting in nature.
I think a blanket, "fuck this guy" is a grave error.
I think that because:
The answer to bad speech is more speech. Post removal is itself a form of speech when it is used where the speech is actually criminal, and in that case it is understandable too.
Bans are not those things and can create martyrs as well as augment attention to other platforms where fans, supporters will quickly gather to discuss injustice.
I think the laughable should be laughed at too.
Civility is difficult here. Forcing that, as in nobody gets offended, dilutes speech down, and prevents "more speech" from working as it should.
That all forms a resonant cycle we probably will not like very much.
At the root of all this is forced trust and an abandonment of safe harbor. That will not be easy to undo, and it will divide people into those authoritarian vs non authoritarian camps too.
I gotta be honest and will gladly side with the anti authoritarians.
Do not need the protection, and I have a very, very thick skin. There are few ways to actually impact me, and I do understand my agency in conversation well.
I can't control others, but I can control me. Bad speech from clowns and asses is entertainment. It isn't something I give weight to. Nobody should.
That some do, and that they are enabled by this idea that protection is somehow needed, is likely to do more harm than good.
Anyway, the Qanon types will find and communicate where the speech is, just as the Alex Jones types will. And then they will share that and martyr status should prove effective in rebuilding the lines of communication, only now we have doubt.
Who is next and why?
How do people know the difference between corruption and abuse of power / money from a genuine act to manage criminal speech properly?
What of things like profanity?
More basically, how do people understand they are having a real conversation as opposed to just a permitted, even encouraged one?
I want the real conversation and will seek that. Got no time for Disneyland type conversation.
Take the example of QAnon. It's a movement based around utter nonsense conspiracy theory. It is debunked on a daily basis. It still has a lot of followers.
Compare to Milo Yiannopoulos. I remember around the time his social media accounts were banned that people said doing so was pointless, and that banning him would only increase his popularity by making him look like an outlaw. Yet his profile has sunk almost entirely.
IMHO a certain amount of this calling for censorship is people from those self-same institutions refusing to look at their own role in undermining the trust basis of our society. How many mainstream journalists bothered to check on those Iraq WMD claims or investigate the realities behind the 2008 financial collapse? How many politicians?
Trust is hard to earn and easy to squander. When it is lost it leaves a vacuum. If nobody is trusted the vacuum gets filled by whatever random nonsense sounds superficially good to someone.
To me the popularity of crazy populist demagogues is entirely explainable as a response to several large scale and very notable cases where our institutions have burned their credibility. If something isn't done to correct this and start re-earning trust I predict that this situation will only worsen and we'll be facing full-blown fascism shortly.
Censorship does not re-earn trust. In fact it does the opposite. Right now hordes of people out there are saying "see! Alex Jones must be onto something or why would they go to the trouble of banning him?!?" Microsoft's threat to ban Gab is the kind of PR money can't buy. I bet their traffic numbers will skyrocket.
I honestly don't know where we go from here - I continue to believe that giving the state the right to determine what speech is allowed and what's not to be far more dangerous than the alternative, and that's not likely to change. But it's almost as if we've discovered that idiocy is contagious, and the vector is the internet. Maybe it's temporary, because the internet is new - maybe society, having never dealt with a deluge of information like this before has no natural immunity to it, and it will develop over time. Or maybe we've found a loophole in human cognition that will end up destroying us.
At any rate, I can't fault a private company for trying to do something about it; it's like a bus company finding out their seats are the vector for some horrible disfiguring viral disease and removing them in an attempt to stop the epidemic.
Aristotle observed this in the 4th century BC. He wrote about it in his book Rhetoric. This is by no means a new development.
The common spaces have to be policed. People with bad intent have to be dealt with not left for everyone to deal with on their own. It is entirely possible to destabilize a society sufficiently for the social contract to break down simply by allowing bad actors to intentionally poison the social commons. You do not want to see that outcome if you are a non-sociopath human.
(Those entertained by intellectual diversions will observe that if you assume that I am incorrect and that in fact the evidence supports OP, you end up with a logically inconsistent position.)
I wish we didn't have to wait. I wish there was a large enough group of people that would boycott companies that make up the internet infrastructure (e.g. ISPs, registrars, DNS, CDNs, cloud providers, etc.) for censoring lawful content. I wish we didn't require mass media articles to get groups of consumers to act in a common interest. And I would prefer it was absolute, regardless of content, because it's frustrating to watch lots of people agree with one decision and then get shocked when these companies become more choosy on the foundation of acceptability that was previously built. To be clear though, I want no government involvement in any way (on the censoring or non-censoring side).
I for one will point an Azure sales rep (or hiring recruiter) to this as a reason I will not engage with them. I have the same feeling about CloudFlare's decision. And I really don't like either of the censored groups.
That's not censorship. That's the free market at work. Gab is free to host their own website; and if necessary their own DNS servers. Hell, they're free to go Tor-only.
If there is a sufficiently large group of willing listeners (or determined trolls), then having a bad idea out there in reach of that audience legitimizes it far faster than private discussion could. At which point, challenging them invites the creation of a cult rather than dispersing the idea.
For a frivolous example, my roommate is now convinced that the Star Wars prequels are legitimately good movies because of r/PrequelMemes.
https://www.theatlantic.com/national/archive/2012/11/its-tim...
I used to subscribe to this position, but now consider it wrong for several reasons.
First, ridicule of a bad idea will have that effect anyway - people who like the idea but not the ridicule will form and gather in niche communities to incubate whatever weird ideas they hold. This is not a bad thing, as such, and can often be productive of new cultural or scientific ideas.
Second, everyone is not a rational philosopher. Some people are easily tricked by fallacious arguments, and other people are willing to treat a threshold level of social proof as equivalent to logical validation. Hence the wide currency of 'skeptical' responses to climate change, the dangers of smoking, or evolution - for a suitable fee, you can find some people who are willing to rent out their reputation or credentials and you can then manufacture the appearance of scientific controversy.
Third, proponents of bad ideas tend to make bad faith arguments, so challenging a bad idea is treated as equivalent to trying to suppress it anyway. Of course, such weak arguments don't hold up very well in philosophy journals or social fora, but that's because people in those contexts generally have academic degrees or equivalent study and there are fairly well established standards of discourse. If someone consistently makes fallacious arguments or recycles debunked ones eventually they will be ignored. In broader social fora those institutional safeguards are absent, while nontrivial socioeconomic payoffs for the production of bad faith arguments are present - in other words, you can make a good living out of pretending to believe the earth is flat because there's a market for comforting lies that help to relieve people's existential anxieties, and the cost of repeating the false ideas is relatively low compared to the cost of debunking them over and over.
Also, you keep making non-sequitur arguments, going from claiming that bad ideas will fester unnoticed in private spaces to slippery-slope 'you're next' arguments within the same paragraph. The latter does not follow from the former, and in fact implies that it is not possible to distinguish between good and bad ideas and that if bad ones are driven to the margins good ones will suffer the same fate despite their merits.
http://comp.social.gatech.edu/papers/cscw18-chand-hate.pdf
Good evidence can't be trusted to change minds; see for example the classic paper "Biased Assimilation and Attitude Polarization: The Effects of Prior Theories on Subsequently Considered Evidence" (https://www.unc.edu/~fbaum/teaching/articles/jpsp-1979-Lord-...).
I don’t think free speech means a platform company has some imperative to offer a level playing field to racists.
The problem is that on the internet there is no open commons where the bad ideas can occur; instead they almost always occur on someone's service at some level. This leads to conflicting freedoms between the service providers and their clients.
It's a nice poetic conceit to think of good ideas as thriving in the light like flowers while bad ideas flourish in the darkness like mushrooms, but they both thrive on the same thing -- attention. To make another comparison with things that grow in the shadows: the crawlspace under my front porch is full of bugs and critters, but they didn't get there because I cast them out of my house, and I don't keep my house bug-free by periodically inviting them in for debates.
Or is it an argument that there is not enough social shaming on the Internet? Since that's what we're likely to get instead of reasoned debate.
Maybe the old arguments in favor of debating bad ideas need to be rethought to account for the fact that there are so many people posting them, and they don't necessarily deserve attention. It's not 1995 anymore; we know that attention is valuable, and it's important to have efficient defenses against denial-of-service attacks due to spam, trolling, viral post-sharing, and so on. Debating every post of the same old outrage isn't efficient.
For bad ideas that aren't new, something like Snopes makes a bit more sense. And I'll point to the Ask Historians subreddit for a good example of how to run a high-quality Internet forum.
Then let them occur in the open. I welcome them to the nearest soapbox.
It's not clear at all, to this student of philosophy, why neo-Nazis are entitled to time on Microsoft's servers.
Ugh. Did you even bother to read the article? This has absolutely nothing to do with "bad ideas." MSFT asked Gab to remove content that "incites violence." MSFT is not cracking down on anyone because they have some disagreement with their public policy positions or political philosophy.
Yelling "fire" in a crowded theater is not a "bad idea," it's deliberately putting people in immediate danger. Likewise, advocating that we physically harm people or commit acts of terrorism against specific locations are not "bad ideas," they're advocating specific actions that will physically harm other people.
Obviously, there's no censorship of edgy and extreme ideas by MSFT, if MSFT wanted to do that they would not be hosting Gab to begin with.
Instead, all they're doing is cracking down on content which incites violence, which is in violation of their terms of service, and is absolutely not protected by the First Amendment.
Finally, can you cite any evidence from the psychological literature showing that shining light on bad ideas changes anyone's mind? Just about everything I've read has suggested that's not the case.
The article clearly states that the removed comments incite violence:
> "Microsoft received a complaint about specific posts on Gab.ai that advocate ‘ritual death by torture’ and the ‘complete eradication’ of all Jews. After an initial review, we have concluded that this content incites violence, is not protected by the First Amendment, and violates Microsoft Azure’s acceptable use policy," Microsoft said in a statement to The Hill.
Are you suggesting that this type of language should be protected by free speech laws?
So yes, according to the Supreme Court, those posts are probably legal.
It would only be things like "everyone, let's meet up at this specific person's house tonight and attack them" that would be illegal.
Big-hammer censoring is not the solution, as it enables censoring the truth too (truth that may threaten, inconvenience, etc., whoever is holding the big hammer), but sometimes respecting the freedom of those who don't respect the same right for others is not a wise policy.
The problem is combining ignorance with confirmation bias and then multiplying it by distribution engines like FB, YouTube, podcasts, etc.
Unfortunately, information is NOT absolute. It's asymmetrical and heavily influenced by our limited world view, ignorance, and confirmation biases, allowing bad agents to take advantage of this fact and then use massive distribution channels like FB etc. That is a recipe for unrest.
If not, how is that possible -- since Gab was created explicitly to tolerate pretty much any kind of barely-legal speech that was under pressure on Twitter? Are we to believe that this style of expression is completely new to Gab? (Or did Microsoft just now notice?)
This includes corporate influence on governments, so that the politicians act as their servants in also squashing said good ideas.
Not sure what some of these ideas might be, off the top of my head right now, but I know that some have to exist.
Yes, we all know about that theory, but what we have seen happening in the last 10 years, as the internet has become populated by the masses, seems to suggest that theory is very wrong.
This is a major escalation and honestly it's changing my mind a bit about the whole issue. I was a fence sitter before but now I'm siding with the libertarian crowd on this one, and I say this as someone who is nowhere near the alt-right politically.
We're going from what amounts to forum policing (though on huge scale quasi-public forums) to infrastructure level censorship. No longer can you just "go set up your own site." Even more significantly we are seeing an incredible amount of coordination between corporations on this, proving that competition and diversity in the marketplace is not sufficient to protect the openness of the Internet as a system.
Once again: liberals really should know better, especially any that care about net neutrality. This is a really extreme example of ISP traffic discrimination and it sets a precedent that this kind of thing is okay. A top level infrastructure provider blackballing a site for speech should be a third rail regardless of the content in question.
Censors always start with the least popular ideas and speakers. They do that for a reason: they want to gauge what they can get away with and they want to shift the Overton window toward increasing support for censorship. They know that few people will overtly stand up to defend Nazis, trolls, and blithering red faced demagogues, so that's where they start. Other popular targets in the past included pornographers, lewd writers and musicians, and religious blasphemers.
Two thoughts that ought to keep you up at night even if you are inclined to agree with these moves:
(1) What happens if/when the political winds shift and someone like Donald Trump or maybe someone even to the right of him ends up in control of these powers? Who gets silenced then? A power once created becomes a political entity that can easily be handed off.
(2) What happens if/when some liberal cause -- like a reboot of unionization and a new labor movement for example -- really threatens corporate and Wall St. profits at a large scale? Will neo-unionists get deplatformed and shadow banned now that it's been legitimized? Will anyone even notice?
Again... we are setting a precedent here that this is okay. You really all ought to know better. The reference metric for freedom of speech is the freedom enjoyed by the most offensive and least popular speakers: political fanatics, pornographers, demagogues, hate-mongers, etc. If they are free to speak then you are free to speak. If they're not then the process of clamp-down has started and you are next.
There's a reason the ACLU has in the past defended neo-Nazis and other unpopular speakers, and it's not because they support these speakers' messages. It's because you need a canary in the coal mine.
Edit:
The conspiracy nut devil that hangs out on my shoulder whispers that this combined with the abolition of net neutrality is a coordinated campaign. Use massively unpopular triggering demagogues to get the left to abandon its commitment to free speech while simultaneously using fallacious libertarian arguments to get the right to eliminate net neutrality. Put those together and you have a Great Firewall of America and enough public support from a broad enough subset of Americans to deploy it.
(The libertarian arguments against NN are fallacious because ISPs are government backed and sometimes even funded monopolies.)
The next ratcheting up would be for peering points to refuse to peer Internet traffic that they don't like, and abolition of net neutrality would allow that. That means Gab (or tomorrow moveon.org, who knows) couldn't even host at an indie hosting provider or overseas.
But that's crazy talk right?
If sites like Gab get banned from being hosted the next step for them will be to move to decentralized platforms or Tor. If the above is true I predict that this combined with the words or deeds of a few nutjobs will be used to start convincing both liberals and libertarian-minded conservatives to start supporting bans on un-escrowed encryption as well as ISP and cloud provider efforts to block "horizontal" network traffic. This is how you'll be convinced to support a ban on P2P protocols and privacy technologies.
Every time a neo-Nazi gets shut down, people trot out the "First they came for the communists…" — and yet the next time, it's a neo-Nazi again, every time. It would be bad if people who aren't promoting harm start getting shut down, but I don't see any evidence that current events will lead to that.
> (1) What happens if/when the political winds shift and someone like Donald Trump or maybe someone even to the right of him ends up in control of these powers? Who gets silenced then? A power once created becomes a political entity that can easily be handed off.
We're not talking about the creation of a new power here. This is a power that already exists, and bad actors' ability to use it isn't based on whether or not it's used for good.
> What happens if/when some liberal cause -- like a reboot of unionization and a new labor movement for example -- really threatens corporate and Wall St. profits at a large scale? Will Neo-unionists get this deplatformed and shadow banned now that it's been legitimized? Will anyone even notice?
Again, there is no causal line between "Neo-Nazis get deplatformed" and "Corporations deplatform their critics." You can have either one without the other.
In general, your fears seem to be based on an assumption that looks something like "Bad people won't do anything to good people that good people don't do to them." But history has repeatedly shown that this is not the case. Our refusal to take the threat of Nazism seriously the first time didn't in any way constrain the things they did to people.
Good post.
People aren't mentioning this, but this has already happened.
I'm not white, just to be clear, however back sometime last year I knew infrastructure-level censorship was coming when DNS services, GoDaddy in particular, started censoring sites. The canary in the coalmine was the Daily Stormer.
It's one thing to want them to be quiet. It's another thing to set the precedent of infrastructure forcing them to be quiet. As you said, it sets a precedent.
EDIT: DNS censorship has indeed been mentioned in this thread.
>libertarian crowd on this one, and I say this as someone who is nowhere near the alt-right politically.
One thing that concerns me is that people who stand up for this or lean libertarian are seen as alt-right simply because they disagree with this corporate censorship. To me this censorship or suppression goes against the founding principles of the internet. I don't understand why it is acceptable to label everyone as such.
Also, so that good ideas can have their day in the sun. Let's not forget that most "good" ideas were once considered bad ideas.
A list of ideas that were once considered bad:
Abolitionism. LGBT rights. Heliocentrism. Germ theory. Endless list.
The idea is that all ideas deserve discussion so that the good rises and the bad sinks.
Also, by allowing censorship when you have likeminded people in power, you also allow other people to censor in the future when power shifts.
This is just basic stuff you learn in philosophy 101. Unfortunately, we don't seem to be teaching basic principles in high school/college anymore.
People naively think that censorship keeps the bad ideas away. Most of the time, it is used to keep the good ideas away. Look at the censorship by the Catholic Church, the Soviet Union, China, Nazi Germany, etc.
If your ideas have solid footing, you will be against censorship. It's when people can't defend their ideas that they advocate for censorship.
This is not something we can ignore by treating all ideas as if they are the same.
In Europe, it's very common to have an exception to free speech when it comes to hate speech, and I have never seen it used in the "wrong" way to keep good ideas away.
I'm not placated. According to Microsoft,
>After an initial review, we have concluded that this content incites violence, is not protected by the First Amendment, and violates Microsoft Azure’s acceptable use policy
If the posts were illegal, why isn't the poster being arrested? Where are the real police in this matter? Why are California cops deferring action to the Redmond internet police?
The solution is to require all hosting and domain companies to act as utilities, and require neutrality in regards to any content that's legal. They're not private forums or homes, they're the internet equivalent to the electricity company or the water company. They're the internet equivalent to a phone service provider or ISP.
Why do/did ISPs have to deal with net neutrality while large hosting companies and registrars get free rein? The power company can't cut off your service because you offended someone online or what not.
If you want to argue the internet is a place for free speech, then it needs places you can host content/register domains/whatever that act like a public square, not a shopping centre.
It is not there to force companies to serve you if they don’t want to.
Exactly this. When the First Amendment was authored, I don't think it was even conceivable that any organization besides a government or state church could effectively censor speech at scale. Now that we have "private" organizations that effectively have that power, it's perhaps time that the law be updated to reflect that change in facts and preserve people's right to free expression.
The internet shouldn't be another Zuccotti Park.
But that's not the case now. The platforms hosting your work are privately owned, the large community websites/social networks/whatever with millions or billions of users are privately owned, and an ever-larger portion of the real-life locations people frequent is privately owned too.
The way society is going seems to be making the amendment less and less relevant as time goes on.
It was certainly much more expensive to distribute writing in 1800 than it is today. It would be much easier to stop a person from printing.
Because ISPs, owing to infrastructure requirements (access to and across property, or access to scarce spectrum), are naturally sharply limited, while anyone who connects to an ISP can host content.
One tactic for deplatforming undesirable sites is a DDoS -- if you put your home server online, what options do you have for DDoS mitigation if the big players in that space decide they're on the political side of those doing the DDoS?
There are a lot of people here completely in support of this, and it's not hard to trace an IP address to an ISP. When brigades start calling for ISPs to stop serving these sites, do you really think that's when they'll stand up for free speech?
Keep in mind in the span of just a few days, we've seen the jump from application platforms (Facebook, Google) to now infrastructure providers. The leap to ISPs is not that far, and something tells me when it happens, supporters will simply move the bar again.
These days, private companies have a _lot_ of power over what speech is allowed to be heard. Perhaps even more so than the government.
The solution is to simply have everyone host themselves. Even a moderate home broadband connection is more than enough to serve up your personal website.
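To illustrate how low the bar for self-hosting actually is, here's a minimal sketch using only Python's standard library. The directory name `site` and its contents are placeholders; in practice you'd also need a domain (or dynamic DNS), a router port-forward, and ideally something like nginx in front, but the core of serving a personal page really is this small.

```python
import os
import threading
import urllib.request
from functools import partial
from http.server import HTTPServer, SimpleHTTPRequestHandler

# Hypothetical content directory; any folder of static files works.
os.makedirs("site", exist_ok=True)
with open("site/index.html", "w") as f:
    f.write("<h1>my personal page</h1>")

# Serve that directory; port 0 lets the OS pick a free port.
handler = partial(SimpleHTTPRequestHandler, directory="site")
server = HTTPServer(("127.0.0.1", 0), handler)
port = server.server_address[1]

# Run in a background thread so we can fetch a page from the same script;
# a real deployment would just call server.serve_forever() directly.
threading.Thread(target=server.serve_forever, daemon=True).start()

body = urllib.request.urlopen(f"http://127.0.0.1:{port}/").read().decode()
print(body)
server.shutdown()
```

Even this toy server demonstrates the point: once the bytes are on a machine you own, the only third parties left between you and a reader are DNS and the ISPs, which is exactly where the rest of this thread's debate lands.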
Once you start relying on third parties you've lost. That's their property, not yours. When you claim it is yours and use government violence to enforce your wishes you are doing more damage to society than what you're trying to stop.
One bit of hilarious irony is that the only reason people in the US can access the Daily Stormer is net neutrality (now just de facto), yet the Daily Stormer was full of people calling for the end of net neutrality.
*free rein
However, if you look at the posts, you'll see a fair amount of unsavory content but no phishing links.
I feel like this was purposeful on the part of whatever political group made these reports.
Most likely, these hosting companies have automated anti-spam/phishing systems whereby, if a large number of reports come in, they will automatically send out an alert to the suspected offending party.
If Outraged Group X goes to the company saying "this site you're hosting has vile speech and so you should take it down," it probably has to go through an internal company process, which might have a slightly higher bar for acting on it.
However, if Outraged Group X falsely claims that "this site is hosting phishing links", something which might trigger an algorithmic response, then the company cannot easily reverse course -- because then the story becomes "Company ABC actively reverses course on allowing hate speech." The PR fallout locks them in.
Free speech has always been a very counter-intuitive process. Opinions that directly oppose yours or even your very person will rub you the wrong way, and to be intent on defending the right of your opponent to voice his opinion seems paradoxical, but we nevertheless concede that this serves the greater good.
The recent Alex Jones deplatforming was justified on the grounds that he disseminated dangerous ideas. With Gab, the argument is that it hosts hate speech. The most compelling argument for free speech is that it removes the subjective and variable limits of speech and, by extension, the slippery slopes.
If you must know, while I do consider myself a political conservative, I cannot bear to listen to Alex Jones and I've never used Gab. I'm radically free speech which means I enjoy living in a world where literally everyone can say quite literally everything. Whether there is a slippery slope angle to all this is not even relevant for me. I want to hear the ideas of everyone who is broadcasting.
The very arguments I am expounding on are already controversial, but they weren't very long ago. Makes you think.
[...] I'm radically free speech which means I enjoy living in a world where literally everyone can say quite literally everything. Whether there is a slippery slope angle to all this is not even relevant for me.
Perhaps you'd like to explain the fundamental difference between the outcomes that lie at the bottom of these variably tolerable slopes.
Of course, I do wonder how far this goes. What happens if the DNS server refuses to host the IP lookup? Host your own DNS server? What if the browsers refuse to allow access to the site? Build your own browser? What if ISPs refuse to transfer the data over the wire? Make your own internet?
People have argued that this and Net Neutrality (at least insofar as it concerns an ISP's ability to control what goes over its wires) differ in some core way, but it feels to me that they are closer than people realize, and the standards set by one can influence the other.
Sure, then the people who drove you off Facebook/Amazon/Apple iTunes/Youtube/Azure/EC2/Digitalocean/Cloudflare will go after your DNS registrar and the ISP/CDN/colocation-facility you're using. They won't give up and they'll ruthlessly go after every single commercial entity you do business with. After all, if it's not the government punishing you for speech, it's not censorship and it's fine!
That's the trick of this kind of lawfare -- having any kind of internet presence inherently involves relationships with private, non-governmental entities, and it's disturbingly easy to suborn them and shame them into refusing to do business with unsavoury people.
No need. It's easier if DNS registrars conspire to confiscate and/or refuse to sell domain names. This has already been done. SSL certs can be, and have been, revoked. The ISPs haven't really been involved in these types of actions since the 90s, so I'm curious what their stance is. For now, if you've been run off the face of the WWW, you still have TOR hidden services or IPFS. Who knows what will happen when the ISPs get involved.
There are still some practical problems for hosting your own website with those "primitives", most importantly getting an internet connection suitable for hosting anything in the first place if nobody is willing to co-locate your servers. One would hope that with the proliferation of IPv6 and FTTx, actually hosting stuff from your basement would become more realistic.
That already happened; the Daily Stormer (yep, Nazis) was kicked off GoDaddy, Google, Tucows, Namecheap, DreamHost, etc. https://en.wikipedia.org/wiki/The_Daily_Stormer#Site_hosting...
even .ru revoked their domain
Clearly Microsoft knew it was hosting Gab and what Gab was. There's a troll argument that suggests Gab is just "free speech twitter", but of course that's not the case: I've been screenshotting the front page for months, from a random anonymous account, and every time I've done it the front page was full of horrible Islamophobic, racist, and anti-Semitic crap. That and bot content.
My point here is: everybody knows that's what Gab is. Microsoft isn't pushing back on Gab's anti-Semitism --- without anti-Semitism, there is no Gab. They had specific harm reduction problems with a pair of posts, one of which, according to this article, was a call for violence directed towards Jewish people.
This seems to accurately describe "free speech twitter". All the content that would get people banned on Twitter is displaced onto sites like Gab and 4chan. The few occasions I've perused these sites supported this: there was a lot of hateful speech against minorities, and a few instances of hateful speech against the majority. The latter made up a very narrow minority, but as far as I could tell these sites upheld their commitment to freedom of any legal speech and didn't censor it.
partial quote:
" Gab is absolutely not a “white nationalist social media platform.” We are a free speech social media platform. We welcome everyone and have since the day we launched. My co-founder Ekrem is a Muslim Kurd in Turkey. Our Chief Communications Officer Utsav is an Indian and a practicing Hindu. Our “frog logo” that the media wants so desperately to tie to “pepe,” was inspired by Exodus 8:2–7 and was designed by our Creative Director Brandon, who is Jewish."
Want another example? Arguing with Ken White (Popehat) on Twitter, the official Gab account RT'd a white nationalist mocking Ken for having adopted Asian children.
https://twitter.com/Popehat/status/1026849669425520640
Out of curiosity, why?
Popular Posts
1) Thug Who Destroyed Trump’s Star and Bragged About It Gets Hit with Felony Charge, Facing Hard Time
2) Alex Jones Breaking: Son Of Terrorist Master Mind Caught Training Child Soldiers To Commit School Shootings Tune in M-F 8-11a
3) Candice Owens Tweets: BREAKING: 71 Illegal aliens were shot, trying to cross the border this morning! —Just kidding. It was actually black people in Chicago last weekend. You can go back to not giving a damn, liberals.
4) Military Support: Honoring Air Force Maj. Walter D. Gray who selflessly sacrificed his life six years ago in Afghanistan for our great Country. Please help me honor him so that he is not forgotten.
Wow, what a sad website. I don't quite understand the mindset/vitriol/hate/anger/etc here (the reason for it or need for it) psychologically speaking.
In any case I've taken my own simple screenshots of the repost in case it disappears, which I expect will happen eventually. If anyone absolutely needs them let me know.
How is a website sad just because it offers true free speech? I think it's sad that the perception of free speech is being cast in such a bad light. Free speech is _only_ relevant if it covers the controversial things, too.
There happen to be plenty of people who go around collecting and rebutting white (and other ethnic) separatists/nationalists on Gab, and they aren't being repressed (in fact, some are receiving tips for the service).
"Users are prohibited from calling for the acts of violence against others, promoting or engaging in self-harm, and/or acts of cruelty, threatening language or behaviour that clearly, directly and incontrovertibly infringes on the safety of another user or individual(s)."
https://gab.ai/about/guidelines
Why would they give Little the option to delete his posts instead of banning him and removing the posts themselves? If I were in their position, it wouldn't even be a question of free speech, considering he was breaching the ToS.
Overall this resembles the runaway effect that neglect of the environment can have. At a certain point it becomes too late to solve the problem, i.e. to control emissions, or, under these circumstances, to oppose this kind of authoritarianism. The very means to do so will be either impotent or gone entirely.
By then you are reliant on some new magical invention to resolve the problem, or on a death toll in the millions, which changes the entire landscape.
Corporations certainly have the right to do this, but they are only choosing to do so because there are strong winds blowing in this direction. It's important that they are opposed, before this paves the way for big mistakes to be made.
It's important that these kinds of ideas see sunlight. Ideas that promote tolerance, peace and common sense will win in the end, provided we keep talking. If talking comes to an end, we know what comes next. These ideas will only find more room to grow, and see greater validation when they are opposed in this kind of authoritarian manner.
Rather than advocating a strategy where we rely on governments and corporations to solve these problems, we should be the ones fixing our societies and arguing against disgusting ideas such as these, with well-reasoned arguments, or even humor, both of which have historically been great at winning ideological battles.
I am sure this crowd knows their Aaron Satie.
"With the first link, the chain is forged. The first speech censured, the first thought forbidden, the first freedom denied, chains us all irrevocably."
"When our enemies say: But we used to grant you freedom of opinion -- yes, you granted it to us, that is no proof that we should do the same for you! That you gave us that just proves how stupid you are!"
(source: https://de.wikiquote.org/wiki/Joseph_Goebbels and https://www.sr-mediathek.de/index.php?seite=7&id=37143 ; translation by me)
Our enemies are fascists who will take away our rights, therefore anyone suspected of eventually becoming a fascist will have their rights preemptively taken away so that it does not come to that?