AlexandrB · 4 years ago
> Violent groups try to manipulate your anger and disappointment.

Anyone else find it weird that Facebook is calling themselves a "violent group"?

Joking aside, I "love" how the language in the warning pretends that Facebook is a perfectly neutral party here, in no way responsible for what content floats to the top of people's feeds.

> You can take action now to protect yourself and others.

Yes. That action is deleting your Facebook account.

heresie-dabord · 4 years ago
> Violent groups try to manipulate your anger and disappointment

"Political parties, foreign states that oppose democracy, and mercenary corporations all give us money for a chance to manipulate your anger and disappointment. While we at F-Book just mine your personal data."

Re-written for clarity.

aaron-santos · 4 years ago
The choice of passive over active voice in media is a way of avoiding responsibility. It shifts the subject from those who transgressed to the instrument of transgression, or to the person wronged.

What it should read is "Natalie, Facebook may have exposed you to harmful extremist content". Instead, the passive voice obscures Facebook's role in deciding which content is served. It's as if the content just magically shows up, a lot like how bullets or rockets also magically show up on their own in other stories.

azangru · 4 years ago
> I "love" how the language in the warning pretends that Facebook is a perfectly neutral party here

I saw people's screenshots of the "Help" section where Facebook elaborates on the subject of "violent groups". I'd say the language there is noticeably skewed towards the liberal left (if there is still any meaning left in those labels).

hairofadog · 4 years ago
Do you mean that from your perspective the language there is skewed toward painting the left as a violent group, or that the language there is skewed in support of the left?
InvisibleCities · 4 years ago
[Sober, well-researched article about the dangers of pervasive social media that calls for strong government regulation]

“This link has been associated with harmful extremist content. Are you sure you wish to proceed?”

Closi · 4 years ago
100% agree - let’s not forget that at the start of the pandemic the social media platforms called the lab leak theory an extreme conspiracy, and some platforms blocked links to it. Now it’s one of the leading theories, so these policies have clearly and unsurprisingly already resulted in legitimate discourse being blocked.
bellyfullofbac · 4 years ago
I try to use the scientific method, and dismissing a possible theory as an "extremist conspiracy" before having all the answers is very unscientific, especially when the other theory (zoonotic spread) doesn't have bulletproof evidence yet either. The fact that some scientists signed on to the zoonotic theory and dismissed the lab leak theory is weird, and the fact that social media banned the links shows their censorship is flawed.

Then again, too many idiots think proof means "The CCP notoriously likes to hide stuff, therefore they're hiding the lab leak, therefore the lab leak theory is real, QED". That sort of disinformation does deserve a ban.

Now watch this comment get downvoted for... not following the herd?

azangru · 4 years ago
> now it’s one of the leading theories

I think the consensus is yet to move on this one. In a recent piece I saw by someone trying to channel the majority opinion of the scientific community, they called it the less likely possibility, or something to that effect. I believe it's only in public opinion that it became one of the leading theories, because the media suddenly decided it was going to allow people to talk about it, and Jon Stewart went on Stephen Colbert...

peakaboo · 4 years ago
That's just one of many events the mainstream media has been disregarding or lying about. It may be extreme to call the mainstream media propaganda, but then again, it may not.
mschuster91 · 4 years ago
> let’s not forget that at the start of the pandemic the social media platforms called the lab leak theory an extreme conspiracy and some social platforms blocked links

That was because back then the people spreading the lab leak theory were almost exclusively Alex Jones-level or worse crackpots.

Dead Comment

darthrupert · 4 years ago
Does this actually happen?
pydry · 4 years ago
If you try to apply ML to this problem.
ttt0 · 4 years ago
Considering the fact that they're fact-checking stupid memes, I wouldn't be surprised.

Dead Comment

gentleman11 · 4 years ago
Instead, they could just stop aggressively pushing extreme and upsetting content with their algorithms to increase engagement. I know people who are getting really depressed thinking all their friends have become extremists. In actuality, that’s just the only content being pushed to them in their feeds (and likewise, the only content being pushed to their friends).
nightski · 4 years ago
Really miss the days when feeds were just chronological lists of posts from your friends/family. No re-shares, suggested content, etc...

Ironically providing this and just charging like $1/mo would be far more effective than labeling content as extreme or not.

gentleman11 · 4 years ago
I miss when there was no feed. You visit their page if you wanted to know what someone was up to
apple4ever · 4 years ago
That's really the problem here. Instead of just the content that I specifically signed up for, I constantly get pushed things I didn't.
jaybrendansmith · 4 years ago
Why should they? They are in business to make money, and cannot effectively self-police. Imagine a time, way back after WWII when fascism had almost taken over Europe and destroyed democracy, when the grown-ups in charge took responsibility for fighting intolerance and propaganda by carefully regulating communication through a government agency. For better and sometimes worse, but mostly better, this agency created LAWS that prevented the distribution of propaganda and obviously false and unbalanced news through regulation of the broadcast media, specific rules about what could be considered 'news', and the fairness doctrine. These grown-ups cleverly called the agency the Federal Communications Commission, whose job was not restricted to just the airwaves: it was to protect the people from hate groups, harmful extremist content, yes, even communist propaganda, and many other unbalanced or biased sources of information.
coffeefirst · 4 years ago
I can’t imagine anyone who’s susceptible to radicalization seeing that message and having any kind of lightbulb go off.
pentae · 4 years ago
Do you think they really want to help people, or is it more likely they are trying to get ahead of being broken up or overtaken by a government-appointed authority?
esyir · 4 years ago
I'll probably weight that warning as "might be ml interesting". Seeing that what I like (the old Web) is dying a slow death under the weight of modern privatised censorship, what they're marking as bad might become signal, of sorts.
overkill28 · 4 years ago
And the first, relatively mild warnings on cigarette packages probably didn't convince many to quit, nor did posting calorie counts on fast food menus make everyone switch to salads. But hopefully it's the start of a drumbeat where this type of content becomes less and less acceptable in society.

It's not the perfect analogy of course, but I think the solution to divisive, negative, nasty online content is largely a social one not a technological one.

tiagod · 4 years ago
I'm thinking it might have the opposite effect for anyone who doesn't trust Facebook.
rndgermandude · 4 years ago
I don't know anybody who trusts facebook (the company).

But I do know a few people who are all too eager to trust everything that gets posted on facebook, and a sizable chunk of these people would indeed take such an "extremist warning" by facebook to mean that "they" are trying to hide things from the public.

0xy · 4 years ago
All of my right-wing friends and acquaintances who received this message mocked it and took it as more evidence of Facebook's bias, so I'd say it was inversely effective.

It will reinforce views.

m-p-3 · 4 years ago
IMO it will further antagonize them, and the first reaction of whoever sees this will be to go further down the rabbit hole.
lazyeye · 4 years ago
I see it more as a way of nudging, herding the cattle so to speak, to get people thinking in the way that big tech wants and finds useful. This is not the same thing as discouraging extremism. By any measure, the invasion and monetising of people's privacy the way Facebook does it is extreme.
swiley · 4 years ago
There should be PSAs warning users about exposure to Facebook and other ML-curated feeds.
tu7001 · 4 years ago
Shows me "this content is not available in your region"... Poland here.
nivenkos · 4 years ago
Cue support for taxing tech companies and protectionism against US Big Tech becoming "extremist".
vixen99 · 4 years ago
Yet another example of a deeply patronizing stance on the part of those who deem 'ordinary people' too stupid to make their own judgments. They behave like some in the monarchy who spout questionable opinions but never engage in public debate. A similar mindset was evident in those senior EU officials in 2016 who privately told John Longworth, Chairman of the UK Independent Business Network that they were shocked British non-graduates were allowed to vote in the Brexit referendum.
sofixa · 4 years ago
> A similar mindset was evident in those senior EU officials in 2016 who privately told John Longworth, Chairman of the UK Independent Business Network that they were shocked British non-graduates were allowed to vote in the Brexit referendum.

Considering the result (Brexit), and what the campaign for that result was run on (flaming shit and disinformation, like their wonderful slogans), can you blame them? It was an extremely complex topic, dangerously oversimplified by people with very questionable credentials and motives at best (Nigel Farage, Boris Johnson), who lied and made empty, impossible promises (like a Norway-style deal but with full judicial independence). Such a question, with such variable outcomes, should never have been asked of ordinary people. Ffs, many polls and trends showed many people didn't have a clue what the EU does and what leaving it means, even after the vote.

peytn · 4 years ago
What should have happened in your opinion? I can’t tell if you’re saying ordinary people shouldn’t have been asked such an important question, or what.
marderfarker2 · 4 years ago
Democracy, am I right?
hirako2000 · 4 years ago
Paternalism, that's exactly what we have been suffering from since before we were born, and it's increasing. I think the Internet has made it so easy for individuals to get information that aggressive restrictions are now being put in place. Of course they come in the form of self-gratification and a sense of duty; FB is here to protect us.

I find these moves good; they show their cards. Hopefully more and more people will start questioning the paternalism, and finally question all these taxes we are forced to pay as well. Our bank accounts, or at least our transactions, can be frozen at any time due to merely suspect activity. The need for a business-specific license for each and every thing one can do, even to basically cut bread in half for customers. Border controls not allowing us to import/export a gazillion products for x and y security reasons. And, I will say it, the near obligation to take some vaccine shots to be able to move around the globe.

ConceptJunkie · 4 years ago
The people whose attitudes don't align with Facebook's will recognize this paternalism for what it is. Those whose attitudes align with Facebook's won't. That's been my experience with these kinds of things.