bradley13 · 3 months ago
Several European governments have jailed people for social media posts. Many Europeans support this - they don't understand how government censorship can quickly get out of hand.

As for falsehoods: some people will be mistaken, some people will lie, and sometimes sarcasm will be misunderstood. Why should anyone be liable? It is on each individual to inform themselves, and to decide what to believe and what to disregard.

Article 19 of the Universal Declaration of Human Rights: "Everyone has the right to freedom of opinion and expression; this right includes freedom to hold opinions without interference and to seek, receive and impart information and ideas through any media and regardless of frontiers."

It doesn't say "if your opinion is approved by the government". It doesn't say "if your opinion is correct". It makes no exceptions whatsoever, and that is what we need to strive for.

greenavocado · 3 months ago
Telegram founder Pavel Durov on Sunday accused France of asking him to remove some Moldovan channels from the social media platform ahead of the country’s presidential election last year.

In a statement, Durov, whose company is headquartered in Dubai, claimed that the French intelligence services asked him “through an intermediary” to help the Moldovan government to censor “certain Telegram channels” before the vote on Oct. 20, in which incumbent President Maia Sandu secured a second term in office following a runoff held on Nov. 3.

He said that, following reviews of the channels concerned, a few were found to have violated Telegram’s rules and were subsequently removed.

“The intermediary then informed me that, in exchange for this cooperation, French intelligence would ‘say good things’ about me to the judge who had ordered my arrest in August last year,” Durov said, describing this as “unacceptable on several levels.”

“If the agency did in fact approach the judge — it constituted an attempt to interfere in the judicial process. If it did not, and merely claimed to have done so, then it was exploiting my legal situation in France to influence political developments in Eastern Europe — a pattern we have also observed in Romania,” he further said.

Durov also said that Telegram later received a second list of "Moldovan channels," which he noted were “legitimate and fully compliant with our rules,” unlike the initial list.

CONTINUED...

https://www.aa.com.tr/en/europe/telegram-head-accuses-france...

pickledoyster · 3 months ago
>Several European governments have jailed people for social media posts. Many Europeans support this - they don't understand how government censorship can quickly get out of hand.

I think quite a few Europeans have lasting and direct experience with totalitarian, oppressive regimes, which might also explain why they have stricter (or simply more precise) laws governing expression – not as an oppressive tool, but as a safety valve for society.

aydyn · 3 months ago
Silencing speech IS the oppressive regime.
wakawaka28 · 3 months ago
Nope, that's giving them too much credit. Censorship is oppressive except in very narrow circumstances. Free speech is actually the safety valve in society. Censorship is one of the hallmarks of a tyrannical regime, and is incompatible with democracy.
alphazard · 3 months ago
> not as an oppressive tool, but as a safety valve for the society.

This strikes me as just incorrect. What example from history shows totalitarianism being successfully avoided because of controls on speech?

The first item in the totalitarian playbook is controlling speech, and there are historical examples of that in every single totalitarian regime that I'm aware of.

gwd · 3 months ago
> It doesn't say "if your opinion is correct".

Opinions cannot be right or wrong.

> It makes no exceptions whatsoever, and that is what we need to strive for.

It certainly does. See libel / defamation / perjury / false representation / fraud / false advertising / trademark infringement.

xienze · 3 months ago
> Opinions cannot be right or wrong.

I’d say if you can be jailed for a particular opinion, someone has certainly made a judgement call that your opinion is wrong!

yupyupyups · 3 months ago
This is all true, with a few exceptions. For example, incitement to violence or false allegations that do serious reputational damage. It's not sustainable to allow absolutely all speech.

Although I mostly agree, I just wanted to make that nuance explicit.

dfxm12 · 3 months ago
> For example, incitement to violence

GP brought up people being jailed for social media posts, but didn't reference any specifically. In the handful of cases I found via a web search, the charges were related to inciting violence.

GP also brought up the Universal Declaration of Human Rights. Article 30 reads:

Nothing in this Declaration may be interpreted as implying for any State, group or person any right to engage in any activity or to perform any act aimed at the destruction of any of the rights and freedoms set forth herein.

When one person's exercise of a freedom restricts another's ability to exercise theirs, it is reasonable to expect courts to get involved to sort it out.

kode95 · 3 months ago
> Many Europeans support this - they don't understand how government censorship can quickly get out of hand.

There are only a few European countries that jail people for wrongspeak, and I can't think of a single one of those countries whose population in general is in favor of such laws.

piva00 · 3 months ago
> It is on each individual to inform themselves, and to decide what to believe and what to disregard.

That's where the conundrum lies: requiring individual responsibility to protect a whole society from potential bad actors who use this freedom to break society apart.

How is it solved? No one knows. What we do know is that relying on individuals to each act on their own to solve it won't work; it never works. We also see the effects on society of losing any social cohesion around what "truth" is. Even though there were vehicles to spread lies and manipulate people before the age of the Internet and social media, this has been supercharged in every way: speed of spread, number of influential voices, size of followings, etc.

Anything that worked before probably doesn't work now. We don't know how to proceed, but reaching for platitudes from before these times is also a way of covering our eyes to what is actually happening: fractures in society becoming larger rifts, supercharged by new technologies and wielded as a weapon.

I don't think government censorship is the answer, nor do I think the answer is just letting it be and requiring every single person to be responsible for critically analysing the insurmountable amount of information we are exposed to every day.

txrx0000 · 3 months ago
> How is it solved?

It is solved by a democratic system that defines truth as "mutually observable phenomena", defines good as "the wishes of the people", and allows individuals to engage in free dialogue as a replacement for violence.

Good outnumbers bad, so the good will win, unless both sides think they're good in a 50/50 split.

This can happen even in that ideal society, because 50% of the individuals will eventually come to have fundamentally different goals from the other 50%. In which case, I don't think we should hold that society together by force, but rather provide a mechanism for it to peacefully split in two, precisely to uphold the democratic principle of respecting the wishes of every individual.

Suppose half of those people are mistaken in a collective delusion, and their goals are in actuality aligned with the other half, but the other half have just failed so spectacularly at enlightening them (or perhaps the delusional half are so spectacularly delusional that they're impossible to enlighten). In this rare case of a perfect failure, they will quickly realize after the split and want to get back together, because reality is a harsh judge, and its judgements are ultimate.

croes · 3 months ago
>"Everyone has the right to freedom of opinion and expression; this right includes freedom to hold opinions without interference and to seek, receive and impart information and ideas through any media and regardless of frontiers."

You do realize that this includes the freedom of people who get harassed online by others.

German journalist Dunja Hayali's rights were violated by hate comments after social media and news sites misquoted her reporting on Charlie Kirk's funeral.

necrobrit · 3 months ago
> Many Europeans support this - they don't understand how government censorship can quickly get out of hand.

This argument can be made for government in general, although granted technology does make it easier for a smaller group to overreach. I'm a European and do hear your concern, but I feel comfortable supporting restrictions on speech _as long as_ there is also a functioning and just legal system that those restrictions operate within. Though there does seem to be a worrying trend towards technology bypassing the legal system and just giving enforcement agencies blanket access of late.

We all also have our own cultural biases and blind spots. I offer this not as whataboutism but as a different perspective: I'm _way_ more frightened by the authoritarian police culture in the US (I base this on interactions with the police during a period when I lived there) than I am by the UK government's internet censorship. The internet censorship could do a lot of harm, but I think not as much potential harm as a large militarised police force willing to bust down doors on command from above.

tsss · 3 months ago
There has never been a functioning and just legal system in the history of mankind. Not to mention that what is "just" is very much up for debate.
epgui · 3 months ago
Are you talking about moderation or prosecution? You do see the difference right?
soco · 3 months ago
Worth mentioning exactly here is Popper's paradox of tolerance: https://en.wikipedia.org/wiki/Paradox_of_tolerance Sooo... what should we do about it, here and now? Because we are getting close to needing a decision. And let me remind you that people do sometimes get fined or jailed for making mistakes or lying.
jgeada · 3 months ago
I think this might be a misinterpretation of the word "responsible".

If a platform lies or spreads malicious content, it seems people want the platform to have the liability and consequences for the malfeasance. That is what most people mean by "responsible".

Government sets the rules, and if someone fails to comply, there are consequences for those responsible. The government isn't responsible; it is holding them responsible.

philipallstar · 3 months ago
> If a platform lies or spreads malicious content, it seems people want the platform to have the liability and consequences for the malfeasance. That is what most people mean by "responsible".

The platform isn't lying, any more than the mail system is lying if I write "1 equals 2" in a letter and send it through the mail to someone else.

matwood · 3 months ago
What if the platform decides that people will send more mail if it pisses them off, and so copies your letter and sends it to everyone in its service area? Is that still just you lying? At what point does the platform become responsible for amplifying what you said? Are you responsible when it's amplified to everyone, if all you ever intended was to send it out into the void?
oscaracso · 3 months ago
Social media platforms do not operate like the mail or telecommunication infrastructure. Suppose that a clique of high-follower verified users on X formed a private discord channel in which they coördinate false but plausible moral panic news events in order to foment spontaneous violent acts against minorities (for added effect, perhaps by pooling their resources into generative AI tools), and that both platforms refused to address this by shutting down the channel, banning the users, or even reducing their reach on the timeline. While there remain reasonable arguments against governing this bad behavior through legislation, it is plain that the social media platforms would be implicated in the negative outcomes of the behavior to a greater degree than a mail carrier.

goalieca · 3 months ago
> If a platform lies or spreads malicious content, it seems people want the platform to have the liability and consequences for the malfeasance.

Does the government then set up a ministry of truth? Who gets to decide that?

LeonB · 3 months ago
A “ministry of truth” would (I assume) be a part of the executive branch of government.

Whereas the creation of laws and the interpretation of laws are powers that the executive branch does not have, and are held separately by the legislative and judicial branches.

In a, well, y’know “functioning” democracy. Apparently.

mrguyorama · 3 months ago
The US already has that. What do you think the courts do?

People complaining about building a "ministry of truth" in countries with anything resembling a functioning legal system are just as clueless as people who cry about "government death panels" while private insurance already denies people lifesaving medicine right freaking now.

jonathanstrange · 3 months ago
Judges. The question is mainly whether there should be rules, independent of the companies' own, that content must follow, with people who feel wronged asserting their rights under those rules in a civil lawsuit, or whether more should be allowed at first, until a civil or criminal lawsuit stops it. (It is already a mixture of both, so it's a matter of degree.)

I personally prefer an emphasis on the first solution, because it better combats the widespread lack of civility on social media, which I believe harms society substantially, but I also understand the people who prefer the second model.

nemo44x · 3 months ago
This is exactly right. To silence even offensive ideas is to appoint someone as the final arbiter of truth — something history shows to be dangerous. Truth doesn't need protection — it needs criticism. Censoring 'offensive' or 'sacred' ideas kills the very process (open debate) that lets society correct errors and find truth, even if it's uncomfortable. Everything else is dogma.
tiahura · 3 months ago
In most common law countries, juries fill that role.

Currently, in the US, internet companies get a special exemption, via Section 230, from the laws that apply to other media companies. If traditional media companies publish libelous material, they get sued. Facebook and Google get a "Case Dismissed" pass. Most people look at the internet and conclude that hasn't worked out very well.

dfxm12 · 3 months ago
At a minimum, keep in mind, perjury, libel and slander get litigated in courts of law. No ministry of truth is required in these cases.
Xylakant · 3 months ago
Are you implying that there are no posts on social media platforms that are plainly and verifiably wrong, and that any such decision needs to be made by a government-created ministry of truth? Is there no middle ground? Maybe something like a court?

If I state here, plainly and as a fact, that goalieca eats little children for breakfast and slaughters kittens for fun, could @dang not look at both a statement from you and one from me and see whether I have sufficient proof?

matwood · 3 months ago
Yeah, the US in particular functions on liability. If changes are made that make companies liable for the externalities from their platforms, they will almost instantly find ways to address the issues.
chii · 3 months ago
> If a platform lies or spreads malicious content

but if the platform didn't go and verify all the content placed on it by users, does it count as "spreading" it?

I mean, there's nothing stopping anyone from publishing a book that spreads lies and malicious content - book banning is looked down upon these days. Why aren't book publishers held to this same standard? What makes a web platform different?

ecshafer · 3 months ago
This survey is too vague to be worthwhile. Sure, it's scary how many people say "yes" to some of these, but people say yes all the time to vague-sounding pleasantries. When you say "responsible", name specific actions. Should people be arrested for what they post on social media when it's an opinion? Should platforms automatically analyze all messages and remove those deemed not truthful? Should the platform be liable in court for falsehoods? People will answer very differently to specifics than to vagueness.
LeonB · 3 months ago
Absolutely.

They hand-wave tremendously complex questions in such a way that respondents are free to interpret them any way they wish.

A similar survey question might be:

> I am thinking of a number. Is it a) too high or b) too low?

txrx0000 · 3 months ago
Ideally, individuals are the ones responsible for moderating content. All filters should be implemented client-side. A person can have all the freedom to decide what they see, but not what anyone else sees.

The problem with group moderation is that it will form disparate, isolated groups that get larger over time. This gradually reduces the effectiveness of democracy and destroys social cohesion. People in the same group see other groups as more and more evil as the groups get larger, so they don't talk to each other as much. The effort required to talk to people in a different group grows as the groups get larger, so fewer and fewer individuals are capable of dialogue, until eventually there are just two echo chambers that absolutely hate each other. And what follows is either violence or oppression, or both, because the conflict has reached the government level.

There used to be a cap on practical group size before the Internet (except in rare cases, which became authoritarian states), so this went unnoticed in democratic societies. But now there isn't. We ought to consciously realize that the base unit of a democratic society is the individual, and create policies that enable decisions at the individual level.

We ought to talk to people that we think are evil. And let them talk, too.
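For concreteness, here is a minimal sketch of what purely client-side filtering could look like. The Post and FilterRule types and the rule helpers are hypothetical illustrations, not any existing platform's API; every rule lives on the user's own device and affects only that user's view of the feed.

    // Minimal sketch: filters run entirely on the client, so one user's
    // rules never change what anyone else sees.
    interface Post {
      author: string;
      text: string;
    }

    // A rule returns true when a post should be hidden for this user.
    type FilterRule = (post: Post) => boolean;

    // Each user composes their own rules locally, e.g. muted words or authors.
    function makeMuteWordRule(word: string): FilterRule {
      return (post) => post.text.toLowerCase().includes(word.toLowerCase());
    }

    function makeMuteAuthorRule(author: string): FilterRule {
      return (post) => post.author === author;
    }

    // The server delivers every post; filtering happens after delivery.
    function applyFilters(feed: Post[], rules: FilterRule[]): Post[] {
      return feed.filter((post) => !rules.some((rule) => rule(post)));
    }

    // Example: hide posts containing "spam" or written by "troll42".
    const feed: Post[] = [
      { author: "alice", text: "Great article on federated moderation" },
      { author: "troll42", text: "SPAM: buy now!!!" },
    ];
    console.log(applyFilters(feed, [makeMuteWordRule("spam"), makeMuteAuthorRule("troll42")]));

Because filtering happens after delivery, the server never needs to know which rules a user has configured, which is the property argued for above.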

exabrial · 3 months ago
I’d like governments to instead enforce monopoly laws and the FTC to sue for crappy business practices. I don’t want them playing speech police.
LeonB · 3 months ago
Yes, absolutely.

The survey could say, “given that the existence of corporate monopolies demonstrates weak and non-functional governments, should governments a) cede more power to the monopolies, or b) pretend to claw power back from the monopolies?”

markstos · 3 months ago
Moderation is a strength of the fediverse, because it is decentralized, with many moderators making possibly conflicting rules about relatively smaller amounts of content.

Moderators can block individual posts, accounts, or entire instances if they have objectionable alternate rules.

Don't like the moderation on some instance? Move to another.

warkdarrior · 3 months ago
Exactly. And moderators need to be responsible and liable for the content they allow through on their instances.
FarMcKon · 3 months ago
This is a false dichotomy presented to people. By framing this as "giant government or giant business?" you are going to get crap answers.

None of these are one-size-fits-all solutions, and there should be a mix. We have a working patchwork of laws in physical space for a reason: it allows flexibility and adjustment as we go, as the world changes. We should extend that to virtual space as well.

Age/content labeling and opt-in/opt-out for some content. Outright bans on other kinds of content. A similar "I sue when you abuse my content" approach for copyright, impersonation, etc.

One size does not fit all, and is not how the real world works. Online shouldn't work much differently.

tiahura · 3 months ago
Two takeaways: 1. As per headline, rejection of state censorship. 2. Platforms should be responsible for user posts.

"Q17D. In your opinion, should each of the following platforms be held responsible or not responsible for showing potentially false information that users post? Base: Total sample in each country ≈ 2000."

Around the world, approximately 70% said yes. The rub, of course, is coming up with a framework. The poll suggests that the Section 230 approach of no duty is widely unpopular. However, strict liability would ensure that these industries go away, and even a reasonableness standard seems like a headache.