mbesto · 4 years ago
He's right in some sense, but the context is important. The problem is that Facebook is giving the village idiot a megaphone. Facebook can't say:

- Amplify your commercial business message to billions of people worldwide.

AND at the same time

- Well, it's your individual choice whether or not to listen to the village idiot.

You guys gave them a megaphone, how do you expect society to behave?!

maerF0x0 · 4 years ago
> The problem is that Facebook is giving the village idiot a megaphone

While you're not wrong that it's giving the idiot a megaphone, that misses the greater picture: it's giving _everyone_ a megaphone. The real question is why can't people discern the difference between the idiot and the non-idiot?

I'd also note that a big issue now is trust -- trust in "elites" (technocrats, the wealthy, those in positions of power) has been declining for a long time. I think people are not so much seeking out the village idiot as massively discounting "experts".

A list of things that come to mind which have broken trust: the 60s saw hippies who wanted to break the norms of their parents/grandparents; the 70s, the Vietnam War and breaking the gold standard; the 80s, "greed is good", Iran-Contra, etc.; the 90s, tough-on-crime policies and Y2K fears; the 00s, Iraq/Afghanistan, the 9/11 attacks, the governmental data dragnet, Manning/Snowden/Assange; and Covid statements which did not pan out as planned...

People have good reasons to be skeptical of elites, but I think anti-corruption work is more important than trying to silence the idiot.

d1sxeyes · 4 years ago
That's also missing the greater picture. It's giving _everyone_ a megaphone... but giving the loudest megaphones to the people who can get most people to listen to them.

You'll have noticed on the internet that there's a tendency to prioritise engaging with things you disagree with (hell, half of my HN comments are because I felt motivated to write something to disagree with some OP at some point - even this one).

What that means is the traditional small-c conservative 'village elders', 'parish priests', and 'elected officials', who hold authoritative positions not because they're controversial, but because they historically represented positions of neutrality and consensus end up with quiet megaphones, and the madmen claiming the world is flat and there's a paedophile ring run out of a pizza shop end up with the loudest megaphones.

Half of the population is below average intelligence, and giving the wrong people the loudest megaphones has a devastating effect on society.

jonny_eh · 4 years ago
> why can't people discern the difference between the idiot and the non-idiot?

Because it's not idiots that are the problem, it's bad-faith actors, and they're very good at manipulating people. In the past they'd have to do that 1:1, now they can do it at scale.

matwood · 4 years ago
> The real question is why can't people discern the difference between the idiot and the non-idiot?

And here is the real problem with FB, the algorithmic feed. Normal life is pretty boring day-to-day, and doesn't trigger 'engagement'. Conspiracies, etc... cause an enormous amount of engagement. When a person is fed conspiracies all day by the engagement algorithm, even the most critical thinkers will start to drift. It works for the same reason advertising works, familiarity and repetition. The solution is never use FB, but that ship has sailed for most.

vlovich123 · 4 years ago
> The real question is why can't people discern the difference between the idiot and the non-idiot?

Societally we solve this through trust organizations. Individually, I have no way to validate the information from every expert/idiot I might come across. So is “connect the frombulator to the octanizer but watch out for the ultra convexication that might form” gibberish, or just your ignorance of the terminology in use in that problem domain? Most people don’t try to figure out how to navigate each field. Heck, even scientists use shortcuts like “astrology has no scientific basis, so it doesn’t matter what so-called SMEs in those fields say”. So you rely on trust in various organizations and peers to help guide you. These structures can often fail you for various reasons, but that’s the best we’ve managed to do. That’s why, for example, “trust the science” is a bad slogan - people aren’t really trusting the science. They’re trusting what other people (sometimes political leaders) tell them the science is. Add in bad-faith actors exploiting uncertainty and sowing chaos and it’s a mess.

Silencing the idiot is fine as long as you’re 100% certain you’re silencing someone who’s wrong and not just someone espousing a countervailing opinion (e.g. Hinton’s deep learning research was pooh-poohed by establishment ML for a very long time).

topkai22 · 4 years ago
Facebook is not just a hosting platform; through the Facebook feed it exercises a great deal of editorial control over what posts/information is surfaced to users. So while Facebook might be giving everyone a megaphone, it doesn’t give everyone the same volume. It needs to own that.
downWidOutaFite · 4 years ago
Erosion of trust in elites just so happens to also be a long-term goal of polluters, quacks, scammers, and other powerful parasites of the common wealth when they run up against government or science.
blablabla123 · 4 years ago
I think in general Facebook has a bias towards inflammatory posts - and, for that matter, other platforms as well, including HN actually. Also, it's easy to blame the village idiot for everything, but I don't think Donald Trump or Alex Jones are village idiots. They are surely idiots, but they left the village quite some time ago and gained popularity before Facebook (InfoWars was founded in 1999) - although FB surely was an accelerator stage.

That said, the village idiot is harmless, and I think aristocracy (rule of the elite) is definitely not the solution. But what is true is that the normal filters in an offline community haven't been translated online yet.

didibus · 4 years ago
> it's missing the greater picture. it's giving _everyone_ a megaphone

I think this can be argued against, because Facebook does recommendation and algorithmic curation.

Even if Facebook didn't purposely tweak things to propagate disinformation, you could say it is easy to manipulate their algorithms to disproportionately push that information.

So for me it's a case of Facebook not doing enough to fight potential abuse on their platform.

There's an element of responsibility here, because we are more prone to some material than to other material. There are primitive instincts in us, and content designed to take advantage of that is parasitic; it is manipulative and addictive in that sense.

Crazy theories, appeal to emotions, controversy, nudity, clan affiliation, and all that are ways to take advantage of our psyche.

Even a smart person is as smart as the data more readily available to them. If the only thing about gender psychology I ever heard about was Jordan Peterson because he's been recommended to me, even if I'm the smartest most reasonable person, this is now the starting point of my understanding and thoughts around gender psychology.

So I think a platform that is optimized to show information that is most designed to make people susceptible to it, and that also targets information to the most susceptible people to present it to is by design going to result in the outcomes we're seeing.

onlyrealcuzzo · 4 years ago
Facebook (like News) is entertainment. People don't select entertainment for accuracy.

The village idiot (that's successful on Facebook) has self-optimized for being catchy - that's why people are listening.

BeFlatXIII · 4 years ago
Not to mention vaccine hesitancy because "when was the last time you got healthcare for free?"
kelnos · 4 years ago
> it's giving _everyone_ a megaphone.

Are they, though? It seems like FB amplifies things that they think will generate more engagement and "stickiness". Sensational things that cause outrage tend to do that more than cold, hard facts. I would not at all be surprised if misinformation gets amplified orders of magnitude more than the truth.

dredmorbius · 4 years ago
Idiots engage.

Reason doesn't.

burnte · 4 years ago
> You guys gave them a megaphone, how do you expect society to behave?!

Considering most of humanity is... challenged when it comes to thinking critically, this should have been an entirely foreseeable outcome. I agree it's society's fault, but Facebook is part of society. They watched how their tool was being used by these people, and ENHANCED the reach of those messages because it was good for Facebook. Facebook is the microcosm of the object of its blame. Idiocy writ large in recursion.

Hokusai · 4 years ago
> most of humanity is... challenged

Most, no. Everybody is blind to one perspective or another. Also, time is limited and attention is limited. Do not think that others are just stupid because their focus or knowledge does not overlap with yours.

"Those people" do not exist. It's just an illusion of your own limited perspective. We are in this together, and calling people stupid is neither true nor helpful.

Jenk · 4 years ago
> this should have been an entirely forseeable outcome

It was. It is. It always will be.

Could you imagine the outrage had the authorities even attempted to prevent Facebook et al?

danpalmer · 4 years ago
Not saying you're wrong, but to take a slightly more charitable view on humanity: Facebook exploits well known human behaviour to amplify content.

It's (unfortunately?) human nature to share shocking things, it may have even been evolutionarily advantageous at some point. Using algorithms to exploit this behaviour at a scale never before possible is harmful to humanity. No idiocy required.

api · 4 years ago
It's much worse than giving the village idiot a megaphone. Facebook (and most other socials) prioritize content to maximize engagement, and (big surprise) the village idiot maximizes engagement. Facebook is a machine tuned specifically to spread hate and bad ideas because that's what maximizes the time people spend on Facebook.

I thought of a good analogy a while back. Let's say someone walks past you and says "hi" and smiles. Let's say someone else then walks past you and punches you in the face. Which interaction maximizes engagement? Well, that's the interaction and content that social media is going to amplify.

Social media companies are the tobacco companies of technology. They make billions by lobotomizing the body politic.

soyiuz · 4 years ago
True, but the financial incentives of many (most?) companies don't align with public benefits (see pollution, plastics, diet, etc.). Why is social media singled out in our demands for them to act morally, instead of just profitably?
ricardobayes · 4 years ago
Supposedly they were working on detuning this effect.
freediver · 4 years ago
> Lets say someone walks past you and says "hi" and smiles. Lets say someone else then walks past you and punches you in the face. Which interaction maximizes engagement?

Likely the first one. Could also lead to a literal 'engagement'.

908B64B197 · 4 years ago
> The problem is that Facebook is giving the village idiot a megaphone

What's interesting is that before Facebook, the only people who could afford a megaphone were either state-sponsored media or billionaires who owned TV stations and newspapers.

For the ordinary citizens, the only way you could be heard was to write a letter to the editor of your local paper. If the state/billionaire/editor didn't like you, your views or anything really (your skin color perhaps?) it would simply not get published, period.

With Facebook, a lot of that gatekeeping simply disappeared. It's interesting to see who has an interest in regulating Facebook and bringing back the "good old days" of media.

s1artibartfast · 4 years ago
I think a better analogy is Facebook gave society a window into each others lives, and people can't look away.

Facebook prioritizes what people want to see, and people want to see train wrecks and inflammatory content.

disambiguation · 4 years ago
> people want to see train wrecks and inflammatory content.

I'm starting to believe this more and more, but what I can't understand is why? We know it has no real "nutritional value", yet we crave it anyway.

Are we just bored and desire entertainment and drama?

What's the evolutionary drive for drama anyway?

cgriswald · 4 years ago
People want inflammatory content like a moth wants a flame. Facebook amplifying a signal from the lunatic fringe preys upon the need of non-lunatics (or different-thinking lunatics) to argue against ideas they consider dangerous or just wrong. As a side-effect, it makes the ideas appear more mainstream, which has the effect of making the ideas more popular. This further increases the compulsion of non-lunatics to address the ideas.

I'm not sure if that qualifies as being what people 'want' or not, but it seems like it's profitable.

TigeriusKirk · 4 years ago
I have no problem with them saying both things at the same time. You're responsible for what you give your attention to, and so is everyone else.
fragmede · 4 years ago
Ultimately, yes, but that's a rather short-sighted position to take when there's a cadre of psychologists and other highly-trained people whose entire job is to entrap you further, just so someone can make (more) money.

E.g. when you buy items at the grocery store, do you consciously examine all the options, including the items on the bottom shelf by your feet? Or do you just go for the items at eye level, and are thus tricked by a similar group of psychologists into buying the product you've been trained to want? And even if you, personally, do, there's a reason product companies pay supermarkets to place their products at eye/arm level - it works.

SamoyedFurFluff · 4 years ago
Sure, but also Facebook has a bunch of doctorate-holding engineers and psychologists dedicating hundreds or thousands of hours to figuring out a system that gets me to give my attention to Facebook, whereas I’m one dude who doesn’t even have a graduate degree, who gets tired and bored and struggles to sleep sometimes.
FpUser · 4 years ago
I am not sure how it goes for the average person. Myself: I just do not go to places where village idiots tend to accumulate like FB or if I do (hard for me not to watch youtube) I just completely ignore all that crap.
geodel · 4 years ago
And that might be most reasonable thing to do.

It seems like a lot of folks here allude, though don't exactly say, that they should be in a position to decide who is "idiot", "bad-faith", "anti-science" and so on.

twblalock · 4 years ago
Should our society have free speech, or free speech for everyone except idiots?

If you agree with the second formulation, who do you think ought to be in charge of deciding who the idiots are? Surely Mark Zuckerberg would not be your first choice.

Maybe there is a third option: no free speech for anyone, all speech must be moderated for lies and misinformation. Is that what you want? In that case, who gets to decide what is true and what is not? Surely Zuckerberg wouldn't be your first choice for that either, right? And what should happen when Facebook blocks "misinformation" that turns out to actually be truthful?

Those who want Facebook to regulate "misinformation" and gatekeep who (and what) is allowed on the site need to admit that they don't actually believe in free speech -- they believe in limited speech regulated by corporations.

JPKab · 4 years ago
Take any of these arguments about Facebook, replace "Facebook" with "printing press" and everything still makes sense, which tells you what this really is:

Cultural elites wanting to control what their perceived inferiors think, believe, and most importantly, vote for.

The same class of people who wanted to regulate the printing press in Europe during the 15th and 16th centuries are the ones who want to regulate the internet today.

lmilcin · 4 years ago
I believe speech should be free but people should be responsible for their speech.

People behave completely differently when there are consequences to what they say.

Speech for "everybody but idiots" is not free speech.

michaelmrose · 4 years ago
Facebook should ban or suspend accounts which spread objective untruths that will tend to be harmful if spread.

You can have your free speech on your own website.

tenebrisalietum · 4 years ago
I want free speech for everyone except idiots.

> who do you think ought to be in charge of deciding who the idiots are?

Think about it. Engineering disciplines have mostly solved this issue. Let's take structural/civil engineering and something that affects many people - bridges. Through a combination of law, codes, and government, not just any Joe Schmoe can build a bridge. Existing bridges generally work well and can be trusted. Sometimes bad things happen, like the FIU collapse, but generally that's very rare.

I don't understand why there can't be a group of people, large or small, educated and from diverse backgrounds, that can set basic standards on what is and is not misinformation, with due-process-like mechanisms such as appeals, etc. It's not an impossible task.

> Those who want Facebook to regulate "misinformation" and gatekeep who (and what) is allowed on the site need to admit that they don't actually believe in free speech -- they believe in limited speech regulated by corporations.

If you're going to use a third party for communication and that third party is not owned by the people (i.e. a government entity) then it follows from the above statement that you don't believe in private property rights.

canistr · 4 years ago
Twitter and Youtube, sure.

But the blast radius of a Facebook post doesn't have the same reach given the majority of posts go to your explicit network of connections. Unless you're specifically referring to Facebook Groups? But then are we certain it's different from Reddit or other forums?

itake · 4 years ago
Facebook Groups and Pages create ways for people to share content, triggering exponential growth (e.g. user shares meme to their page so that their friends see it. Their friends choose to re-share. wash. rinse. repeat.)
sharadov · 4 years ago
That's too simplistic and naive, their algorithms amplify what will get the most clicks!
foobarian · 4 years ago
I think it's not so much Facebook alone but the entire Internet. The connectivity between humans is suddenly increased manyfold, and reaches much wider. Imagine using a graph layout tool on a giant graph with only a few localized connections. Likely the picture will have evenly distributed nodes without much movement. But then, as you dump all these new edges onto the graph, the nodes start to move into rigid clusters separated by weak boundaries. I think this is what's happening with the red/blue, vax/antivax etc. groups.
kelnos · 4 years ago
The internet alone doesn't connect people. Remove things like Facebook and Twitter, and how do you get this giant interconnected graph with few localized connections?
servytor · 4 years ago
Yeah, I always hear people talking about the great "global village" where everyone is 'connected', but I have to admit I am against it. I don't want to be prank called.
commandlinefan · 4 years ago
> the village idiot

One man's terrorist is another man's freedom fighter.

jeffrogers · 4 years ago
Right. Prior to social media, people were vetted many ways and in every context in which they gained an audience. (e.g. earned standing in social settings and community groups, promotions at work, editors of one sort or another when publishing to a group, etc) Audiences grew incrementally as people earned their audience. Social media removed all that vetting and it inverted the criteria to grow an audience. Sensationalism was rewarded over thoughtfulness. So one of the most important tools we've always relied on to judge information was removed. Hard to believe, as intelligent as these folks at Facebook/Meta are said to be, that they don't understand this. Feels disingenuous.
LeifCarrotson · 4 years ago
It is difficult to get a man to understand something when his salary depends upon his not understanding it.

- Upton Sinclair

antman · 4 years ago
The problem is that facebook is giving people earplugs. Biases and minority opinions get clustered together in huge echo chambers by eliminating mean societal influence.

This has assisted valid and invalid minority opinions to be heard.

What wasn’t there was critical thinking on behalf of people who were already overwhelmingly exposed to mass political marketing and had developed a pseudo-Asperger response. I will agree for once with the Facebook exec: political philosophy has pretty much come to the conclusion that since there is no unique definition of good or bad, there is no algorithm that can do it.

JPKab · 4 years ago
The problem is that Gutenberg is giving the village idiot a megaphone. Gutenberg can't say:

- Amplify your commercial business message to billions of people worldwide.

AND at the same time

- Well, it's your individual choice whether or not to listen to the village idiot.

You guys gave them a megaphone, how do you expect society to behave?!

freediver · 4 years ago
So should there be a special tax on "megaphones" like Twitter, Facebook, or YouTube? What exactly is the legal framework under which these companies could be scrutinized? Normally the manufacturer of megaphones does not get sued when a person uses one to promote hatred on a village square.
sverhagen · 4 years ago
I think the megaphone is thus more of a metaphor than it is an analogy. Or at least, like most analogies, it breaks down under even the lightest pressure. For it to be an analogy, it'd have to be a megaphone manufacturer that also brings the audience together. Maybe Facebook is the megaphone AND the village square AND then some.
noahtallen · 4 years ago
That’s what’s challenging about this situation. We’re experiencing a fairly new problem. It hasn’t before been possible for a member of society to communicate with all other members of society at the same time, nor has it been possible for a member of society to get addicted to a curated feed of random (sometimes anonymous) folks spreading their ideas globally.

All of these things seem new to me:

- Global, direct communication with all members of society.

- Addictive design patterns in software.

- AI-curated news feeds based on increasing engagement.

- Anonymous conversations.

Since it’s new, society doesn’t have frameworks to think about this kind of stuff yet.

andrew_ · 4 years ago
I don't understand how a targeted tax would help at all here.

betwixthewires · 4 years ago
It's not a megaphone, the only people that can see it are literally the village idiot's friends and family. It's gossip within your social circle.
seanmcdirmid · 4 years ago
But village idiots can share content into their social circle from other more popularly known village idiots.
kjgkjhfkjf · 4 years ago
I wouldn't blame megaphones for the fact that "idiots" use them. Nor would I expect megaphone manufacturers to dictate what messages can be amplified using them. Nor would I expect megaphone retailers to determine somehow whether a person was an "idiot" before selling them a megaphone.

If someone uses a megaphone in an anti-social manner, that's a matter for the police to handle.

michaelmrose · 4 years ago
Analogies are nearly useless in making an argument here. Facebook is an online platform with real-time access to users' communications, and with metrics and analysis of how it's used, which allow it to make reasonable predictions about how it's going to be used in the future.

Comparing it to dumb hardware is ridiculous.

Their ability to predict the negative effect of amplifying crazy provides a moral imperative to mitigate that harm. In case you don't understand: there is a difference between the platform allowing Bob to tell Sam a harmful lie, letting Bob tell 200 people who each tell 200 people..., and, different yet, algorithmically promoting Bob's lie to hundreds of thousands of people who are statistically vulnerable to it.

lijogdfljk · 4 years ago
So I think this is a breakdown of our previous mindset on the matter. I don't know what future is "right", what the answer is... but I think it is important for us to at least recognize that in the past, a crazy person on the street corner was limited quite a bit in velocity.

This megaphone is a poor example, imo. A far better example would be broadcast television. We're now broadcasting everyone straight into not just American homes, but worldwide.

So I ask, because I don't know: how does broadcast television differ from a megaphone in requirements? What responsibility is there on broadcast television that doesn't exist for a street corner?

kelnos · 4 years ago
Is it a problem, however, if the megaphone manufacturers specifically look for people who spread misinformation, and sell them the loudest megaphones with the most reach?

FB has not directly done that, but they have consistently refused to acknowledge that selling the biggest megaphones to the people who create the most "engagement" (aka money for FB) tend to be the types of people who generate false information and outrage.

Their publicized efforts to shut down fake accounts and pages set up specifically to spread misinformation are perfunctory, simply something for them to point at and say, "see, we're doing things to fix the problem", when they're merely playing whack-a-mole with symptoms; they know what the root of the problem is, but refuse to fix it because it's their cash cow.

tspike · 4 years ago
Would you expect megaphone manufacturers to give souped up models capable of drowning out other megaphones to only the most controversial, destructive people?
baq · 4 years ago
it isn't the village idiot, it's the insidious manipulator that influences village idiots at industrial scale now.
alpineidyll3 · 4 years ago
Internally, Facebook works aggressively to combat covid misinformation (source: I work at FB). Literally most of the commonly used datasets are about it. It's easy to hate and hard to understand.
mikem170 · 4 years ago
What about their algorithm?

Facebook decides what to show people. They could show you your friends posts in chronological order, and/or let people have control over what they see.

But no, Facebook decides what people see. Therefore they have some responsibility for the spread of misinformation.

andrew_ · 4 years ago
This is really the crux and it doesn't get as much attention as it should.
dntrkv · 4 years ago
It doesn't get enough attention? The "algorithms" are all anyone talks about when it comes to this issue. I think people put way too much weight on them.

Once you have enough people participating in a single conversation, from all walks of life, the discourse will go to shit very quickly.

Look at Reddit as an example. Small subreddits that are focused on a specific hobby are usually pleasant and informative. Large subreddits that cater to a large audience are no better than the comment section on a political YouTube video.

saddlerustle · 4 years ago
It's really not. WhatsApp has just as big of a misinformation problem without any sort of algorithmic ranking.
freediver · 4 years ago
And people decide to use Facebook. I am not trying to defend it, but blaming it 100% on Facebook is not fair. Even if their algorithms were perfectly tuned to amplify misinformation, there would still need to be enough people reading and sharing content for it to have an effect.

A solution could be paying for Facebook, where both the number of people and incentives would change.

mithr · 4 years ago
> blaming it 100% on Facebook is not fair.

The problem is that humans are never 100% rational. If the audience for Facebook were purely rational robots, then sure, you could make the argument that since they could just stop caring about these problematic things and the issue would go away, it's not Facebook's fault that they haven't done that.

But given we are talking about humans, once Facebook has spent considerable time and money studying human behavior and society in general, exactly in order to figure out how to maximize their own goals over anything else, I think they should take the blame when there are negative side effects to doing so. Saying "well if society just stopped caring about (e.g.) negative content it'd be fine, so it's society's fault" is misdirection at best and ignores both the concentrated effort Facebook has spent on achieving this outcome, as well as the hoops they've spent the past few years jumping through to defend their choices once people started calling them out on it.

nowherebeen · 4 years ago
> And people decide to use Facebook

I don’t choose to use WhatsApp but I have to because that what my family members use and they aren’t tech savvy enough to use anything else. So no, it’s not a simple choice. Once a product gets saturated in the market, it gets very difficult to replace it.

hardtke · 4 years ago
Facebook doesn't really decide what you see, but instead optimizes what you see to maximize your engagement. If you never engage with political content or misinformation, you generally won't see it. Once you start engaging, it will drown out everything. What they could provide is a "no politics" option, but I wonder if anyone would utilize it. There was an old saying in the personalized recommendations world along the lines of "if you want to figure out what someone wants to read, don't ask them because they will lie." For instance, if you ask people what kinds of news they want they will invariably check "science" but in fact they will mostly read celebrity gossip and sports.
mithr · 4 years ago
Facebook decides what you see. That they have created an algorithm that "maximizes engagement" is just another way of saying that they've decided what you should see is what they believe will keep you most "engaged". They could choose to use a different metric -- it is entirely their choice.
wvenable · 4 years ago
Facebook has experimented with a number of different options to clean up your feed, but ultimately they never get deployed because they all decrease engagement.
kgin · 4 years ago
It’s not false that there is a societal problem that is not unique to Facebook.

But that sidesteps the question of what responsibility they have as a company whose profits are, at minimum, powered by that problem, if not exacerbating the problem.

“Privatize the profits, socialize the costs” is not sustainable.

jensensbutton · 4 years ago
> whose profits are, at minimum, powered by that problem

I don't think it's established that that's the minimum. Facebook usually argues that it's a small minority of their content and I don't see any evidence against that (it just gets a lot of scrutiny). It seems like if you magically removed all the "bad actors" they'd make just as much money.

joshenberg · 4 years ago
Exactly. Running 'the world's largest vaccine information campaign' rings hollow when it's really a mitigation effort. That's akin to saying that the Valdez tragedy and subsequent clean-up made Exxon the top environmentalists of '89.
sneak · 4 years ago
> “Privatize the profits, socialize the costs” is not sustainable.

The church (as well as the integrated state) has been doing it for thousands of years. I think treating the general population as a somewhat renewable resource to be mined/harvested has a longstanding tradition and history of being one of the most sustainable things in human history.

Depending on how tax money is spent (i.e. in ways that benefit a subset of society, rather than all taxpayers) this is perhaps the most common and longstanding tradition we human beings have.

geodel · 4 years ago
Indeed. Many think if it sounds like feel-good rhetoric then it must be true.
cletus · 4 years ago
Look at video games, particularly on mobile. I mean they aren't even games anymore. They're just metrics-optimized psychological-trick machines to extract the most money from you $1 at a time ie in-app purchases and pay-to-win. These aren't games: they're engagement bait to bring you and your wallet back each day.

Why do we have this? Because people suck and it just makes way too much money for anyone not to do it. Why didn't we have this 20 years ago? Because the technical capability wasn't there.

It's really no different here. Communication and messaging costs have really gone down to zero. If it wasn't FB, it'd be someone else. There's simply too much money with very little costs in engagement bait, whether or not that's the intent of the platform or product.

And yeah, that's the case because people suck. Most people aren't looking for verifiable information. They're looking for whatever or whoever says whatever it is they've already chosen to believe. That's it.

I'd say the biggest problem with FB and Twitter is sharing links, as this is such an easy way for the lazy, ignorant and stupid to signal their preconceived notions to whatever audience they happen to have. But if Twitter or FB didn't allow sharing links, someone else would, and that someone else would be more popular.

I honestly don't know what the solution to this is.

GuB-42 · 4 years ago
> Look at video games, [...] I mean they aren't even games anymore. They're just metrics-optimized psychological-trick machines to extract the most money from you $1 at a time [...]

Did you just describe arcades?

notacoward · 4 years ago
I don't think so, and I used to spend a lot of time in arcades. For one thing, everyone had to pay the exact same amount to play. Good player, bad player, whatever. There was no "free tier" to get you hooked before things suddenly got much harder, and when they did there was no way to pay more to make things easier (though toward the end some games did let you continue by pumping in more tokens). Every game was also self contained. There was no persistent state that you were in danger of losing if you didn't keep checking in day after day, week after week. Fortunately, the games were also cheap. Even when I was really poor, I could afford a dollar for several tokens that were enough to play the games I liked just about as long as I could stand. Hour-long games of Battle Zone turned into all-afternoon games of Joust. I could turn a game over to someone less skilled, go back to my apartment, eat lunch, come back, and pick up again before they'd managed to exhaust the lives I had accumulated.

Arcade games were certainly meant to be enjoyable, and to keep you playing, but they were nowhere near the dark-pattern psychological minefield that modern games - especially mobile games - often are.

supertrope · 4 years ago
Arcades may have heavier users but not any “whales” who are using mom’s credit card.
clavicat · 4 years ago
Or casinos.
themacguffinman · 4 years ago
The solution is probably not to freak out so much and cultivate your (metaphorical) garden. People will adapt as they always have. This exact moral panic plays out every time the economics of information changes as far back as the printing press, which a 16th century scientist warned would create an overabundance of information that was “confusing and harmful” to the mind [1]. Even further back, Socrates apparently warned of the dangers of writing (the danger being forgetfulness). I don't think they're strictly wrong. They just overvalue the benefits and overlook the problems of older technology. Unalloyed good is not the bar. Technology is never an unalloyed good because we are not an unalloyed good.

Yet each and every time, "but this time it's different though", "but this is unprecedented".

Yes, the world of information is bigger and more complex. Yes, we also invent bigger and more complex tools to manage our bigger and more complex world. And the world is richer for it. There's never been so many quality video games and insightful media than today.

[1] https://slate.com/technology/2010/02/a-history-of-media-tech...

novok · 4 years ago
There are still many video games that are not click boxes, and if you watch what kids are actually into, they tend to be actual games like Roblox, Minecraft, Among Us and Fortnite. Even the small mobile games they play end up being mostly games rather than metrics-optimized click games.
Exendroinient · 4 years ago
So the biggest obstacle to solving anything about this world is the stupidity of the general population. People can't stop choosing the most stupid things with their wallets and attention.
bmitc · 4 years ago
As harsh as I am on humanity, it's not entirely people's fault. We have primate emotions that have been grossly outpaced by our technological development. This is why I've become somewhat anti-technology despite still working on technology because it stretches us to lifestyles that are distinctly inhuman.

Humanity simply does not have the emotional capacity to handle the technology we create. It never has and never will, and it's just that software has greatly amplified this inability. This is why mental disorders are skyrocketing. We're building emotional prisons with technological walls and bars.

gretch · 4 years ago
Not everything that's stupid is harmful.

It's fine if everyone is stupid, or most of us choose to engage in stupid things from time to time. I paid $17 to see Transformers. That was pretty stupid, but oh well, no one got hurt.

People want to be entertained and engaged - I do not think they aim to harm. So I think we need to develop alternatives to entertain people - these alternatives can still be stupid, they just shouldn't be harmful.

Bendy · 4 years ago
“The NRA say that ‘guns don’t kill people, people kill people’. I think the gun helps. Just standing and shouting BANG! That’s not gonna kill too many people.” - Eddie Izzard, Dress to Kill (1999)
dandotway · 4 years ago
Homo Sapiens have been murdering each other on this planet for at least 10,000 to 100,000 years, and have only used burning-powder projectile launch tubes for roughly the past 1000 years. (Poison darts launched from a blowgun are a more ancient form of killing tube.)

When convenient projectile launching killing tubes aren't available, Homo Sapiens will rapidly revert to 10,000+ year old murder methods, and thus a husband inflamed with murder-rage who just learned his wife's ovaries have been voluntarily fertilized by another man's semen will not infrequently use punches or a nearby blunt object (hammer or rock) to fracture her skull and destroy her brain function, or use his hands to crush her windpipe, or bleed her out with a knife. This has been happening essentially every year for at least the past 10,000 years. If his wife had been armed with a handheld projectile launching killing tube she could have defended herself, but women frequently don't carry projectile tubes and frequently vote for restricting access to projectile tubes, because projectile tubes are loud and scary and make them feel unsafe.

mcguire · 4 years ago
Homo Sapiens have been killing each other on this planet for at least 10,000 years, and only used nuclear weapons for roughly the last 75 years.... If his wife had been armed with an M-28 Davy Crockett firing an M-388 round using a W54 Mod 2 with a yield of 84GJ, she could easily have deterred him, but she would not be legally allowed to carry military weapons.

Maintain the purity of your bodily fluids, friend!

teawrecks · 4 years ago
Why use many word when few word do trick?
adolph · 4 years ago
Izzard is from England, which has a high degree of gun control. The murder rate doesn't seem to be strongly correlated with regulation, however, which lends less credence to Izzard's conjecture. Maybe people who would murder will use whatever tool is available, or aren't concerned about breaking gun laws?

https://www.macrotrends.net/countries/GBR/united-kingdom/mur...

https://en.wikipedia.org/wiki/Firearms_regulation_in_the_Uni...

root_axis · 4 years ago
The murder rate seems a lot lower than the US. Also the "murderers aren't concerned about gun laws" line is specious; a criminal isn't concerned about laws by definition - that doesn't invalidate the reasoning behind passing a law.
Bendy · 4 years ago
So long as it’s mightier than the sword.
teawrecks · 4 years ago
Your link shows the US has 4x the homicide rate as the UK. Meanwhile, Japan has incredibly restrictive weapon laws (including knives and swords), and they have a fraction of the rate of everyone else.

And before you point to Switzerland, I am all for gun ownership by certified, well trained, responsible citizens. But the US doesn't have that. Either decrease access to guns, or enact 2 years of compulsory military service where you are trained to respect your weapon and know precisely when and how to responsibly use it AND store it. If you do neither, you get the US.

And in either case, we need to improve the mental wellbeing of everyone in the US by giving more people access to "free" healthcare and not stigmatizing mental health.

actuator · 4 years ago
Title is really poor compared to the content of the article.

In any case, he is right. Look at the pattern: any large social network has these issues, which more or less seems related to how people interact. Twitter is massively toxic; Reddit is. Back in the day Tumblr, which was never as huge as current social media, also used to have the kind of content Facebook gets blamed for.

Give a platform for people to publish and share and every opinion has the chance to be there.

It also doesn't have to be a massive broadcast platform, messaging platforms with small communities in the form of groups have these issues on a smaller scale. Though broadcast does make it worse.

knuthsat · 4 years ago
Whenever I see toxicity, I always assume it's a young individual. I remember myself in teens, always up for some trolling on forums.

Today, I just don't have the urge to do such things. Whenever I encounter this weird behavior, I just stop interacting, because I have a feeling it's some 13-16 year old wasting my time.

delusional · 4 years ago
The maker of the gun may not be held solely responsible for murder, but we surely have to consider if we want guns in the hands of murderers.

Facebook isn't arguing this from a neutral standpoint. They are arguing from the position of the company that stands to lose their business if society decides murderers shouldn't have guns.

actuator · 4 years ago
But that's the exact issue, no?

Social media platforms create immense value as well. I might not be a user but I can see how it helps a lot of people as well.

Do we enforce license on people to use social media platforms?

abc_lisper · 4 years ago
But there is demonstrable research done inside FB that indicated the harmful effects it had on society. WSJs FB Files has done a good job showcasing that research. There are knobs they can control that could reduce anger and thus engagement, but they have dialed them up. WSJ wrote about that too. FB philosophy is more engagement, however they get it.
nefitty · 4 years ago
I don't think that's true. HN is pretty non-toxic, from my perspective at least. Reddit is tolerable to me.

I think the problem is misalignment of incentives. If I'm incentivized to increase engagement, then I can think of some pretty ridiculous shit to say that will get lots of people clicking downvote and shouting at me.

I think these platforms could have been more proactive in setting those cultural norms that evolve into fruitful social interaction.

actuator · 4 years ago
Have you been to any thread which mentions say China, India or several other topics?

You will see a lot of toxicity and you can find several deleted or heavily downvoted comments.

Even the HN front page and comment section are algorithmic. Vote brigading happens here as well, for posts and comments.

carabiner · 4 years ago
Lol tumblr did not have much right wing boomer content that I saw.
forgotmyoldacc · 4 years ago
That misses the point of the OP. Tumblr (as a general social media platform) had incredible amounts of bullying and toxicity. Whether it's from the left or right is a moot point.

Example found after 10 seconds of Googling: https://www.vice.com/en/article/3da838/an-attempted-suicide-...

d0mine · 4 years ago
Said the person crying "Fire!" in a crowded theater. (It is human nature to panic, but that doesn't excuse the instigator.)
cronix · 4 years ago
Facebook chooses what I see while on their platform. If they didn't, I'd just see a chronological feed of my friends posts that I chose to follow as they came through without any external filtering. Going directly to friends walls shows that is not the case.

Instead, they amplify emotionally based content that they think I will react to (engagement) by studying previous interactions and don't show me things they don't agree with (censorship) even if it originated from an authoritative primary source. That doesn't sound like it originated in society, but more of a purposeful curation imposed on users, who have to conform if they want to stay. I didn't.

scyzoryk_xyz · 4 years ago
Yeah I don't think someone in that role from fb is particularly qualified to talk about society and human nature in relationship to social networking.

It's like listening to someone who builds, designs and optimizes production lines in cigarette factories philosophize about why people smoke and whether it is their free choice to do so.