Readit News
olladecarne · 5 years ago
"Show me the incentives and I'll show you the outcome."

This whole ad-driven attention economy is probably as dangerous an invention as nuclear weapons. We're caught in a state of psychosis, and individually none of us can do anything to fight the billions of dollars spent on advertising that tries to make us feel inadequate. Targeted ads and recommendation engines need to be banned. I know some people say they like them, and some people also say they like tobacco, but we generally agree it's bad for us. Companies will find other ways to innovate. Content curation, organization, and quality will become more valuable and eventually the experience will be better.

I say this because half-hearted measures like the one in the article are not going to make any difference when the entire business model of the internet is clickbait.

kanox · 5 years ago
> Targeted ads and recommendation engines need to be banned.

I don't see how you can "ban ads" without instituting a totalitarian dictatorship.

Targeted arguments are a core part of political campaigning and polarization has been increasing in the US for decades, even before the internet.

My pet solution would be to switch to a multi-party system that isolates rather than amplifies fringe voices. In Germany, people can look at what AfD and Die Linke have to say without having to choose between them.

Deleted Comment

LackOfImg38 · 5 years ago
Lack of imagination is the problem.

I use a screen for coding work, HN breaks, shopping, civic life like bills, and the occasional stream binge. Outside work, maybe 5-10 hours a week.

Otherwise I write creatively on paper, learned guitar, share digitally via point-to-point, and often leave my house for no reason at all.

Personally, I have a hard time buying that anything you describe is some sort of obligation; it seems more like a repeated hand-me-down habit.

sjs382 · 5 years ago
It's not that the ads are targeted, but rather what they're targeted based on.

Google's search ads used to be displayed based on the keywords being searched, rather than on the user viewing them; Facebook, based on what you've "liked"; Amazon and eBay, based on what you've bought before on their sites.

Go back to targeting ads based on page context and explicitly provided information (search queries, what I actually enter into my "profile", etc), rather than machine surveillance and inferences.

xlm1717 · 5 years ago
Individually we can do something to fight it. If you can't block the ads, learn to ignore them.
musicale · 5 years ago
Like ad-blocking, ad-ignoring is unfortunately something of an arms race. The better people get at ignoring ads (e.g. "banner blindness"), the harder advertisers work to make them hard to ignore, either by making them more "attention grabbing" (e.g. modal windows, auto-playing video) or by baking them into the content (e.g. sponsored posts, advertorials, product placement).
asdff · 5 years ago
Considering advertising works on a subconscious level, ignoring it will take some serious mental discipline. Maybe after spending a few years in a monastery on a mountain you'll be able to resist its invasion into your subconscious headspace.
prohobo · 5 years ago
Regardless of your view on "misinformation", this sounds like a stupid idea that's meant to cover Twitter's butt in the short term.

The goal of any social platform should be healthy engagement. Where is the research that this change will foster that? Twitter isn't really the place to be testing half-baked ideas.

Also, misinformation is coming from mainstream media as well as conspiracy theorists. At least conspiracy theorists are trying to find the truth. Twitter doesn't seem to acknowledge that, so what we're doing is creating digital totalitarianism? That's their great idea?

drchopchop · 5 years ago
This is not "totalitarianism"; it's a reduction of free newsfeed distribution. Twitter, and society, doesn't have a long-term interest in a newsfeed flooded with false/defamatory information (claims of widespread voter fraud, for example).

>conspiracy theorists are trying to find the truth

Some of the biggest conspiracy theories that flourish on the web are the result of viral, provably false, information. These people are not truth-seekers, there is no proper research or scientific method - it's instead the equivalent of the supermarket tabloid taken to the extreme: QAnon, flat earthers, 5G cancer, etc.

dudeithinkBLM · 5 years ago
Ugh, those three listed are widely accepted in truth seeking community to be utter nonsense. They are ridiculed and dismissed as theories planted by shills and three letter agencies to make conspiracy theorists seem crazy. Instead substitute three conspiracy theories truth seekers actually believe: "three World Trade Center buildings fell on 9/11", "fluoride in the water lowers your IQ" and "vaccines are full of contaminants and neurotoxins". Just looking out for you bro.
prohobo · 5 years ago
This tacit acceptance of a "newsfeed flooded with false/defamatory information" when it happens to support one obvious partisan side is, frankly, something embarrassing to have to see.

I have no interest in defending the Trump administration except to point out that what you are supporting is, in fact, a hypocrisy. I can think of four examples which I know are hypocrisies that make it through the supposed "fake news" filter.

1) Russiagate, an unsubstantiated hoax promoted in the media for 3 years leading to countless defamatory attacks on Trump, and millions of dollars spent on investigations.

2) Cambridge Analytica, a tactic used by Obama's campaign which was praised in 2011 as being a new progressive way to reach voters, and then miraculously became an act of corruption when Trump did the same thing in 2016.

3) "Mostly peaceful protests", an ongoing gross misrepresentation of what normal people would call riots.

4) "Trump supporters are racist white men", an ongoing smear of the Trump administration and emotional abuse towards the Trump base (or anyone who doesn't buy into the narrative), which has been proven absolutely false by the latest election.

So which is it? You do or you don't want a newsfeed flooded with false/defamatory information? Just allow the stuff you don't personally see as false/defamatory?

This is an extremely dangerous problem, and people need to start waking up to it instead of thinking they got it all figured out, as if Alex Jones yelling at frogs is the reason everything is falling apart.

The fragmentation of reality which we're seeing (Trumpers, leftists, QAnon, flat earth, 5G, etc.) will NOT be solved by slowing down how fast people can "like" something. The entire internet is broken, and the social media platforms need massive rewiring. We need to properly research the ways the current platforms poison discourse, and find remedies that work to dissolve fragmentation and help people communicate better.

Currently, things are rapidly spinning out of control, and the social media platforms have decided to opt for totalitarianism. As they ban and hamstring everyone who doesn't buy in, good or bad, those people will find each other on the decentralized internet. This is creating a powder-keg for narrative chaos and conflict.

Dead Comment

Dead Comment

lhnz · 5 years ago
There will be tweets with tens of thousands of likes/retweets saying "Are others able to like this?", and so the information will be Streisand-effected to many more people.

Additionally, if the media makes the wrong call due to insufficient evidence at the time or bias, then they could take a hit to their reputation.

It will almost become tactical for the original posters to hold back evidence and wait for their "misinformation" categorisation and then to "disprove" this to make the censors look like liars.

entropea · 5 years ago
>Additionally, if the media makes the wrong call due to insufficient evidence at the time or bias, then they could take a hit to their reputation.

Does that ever happen? It seems they're exempt from repercussions.

shaicoleman · 5 years ago
The culture of any community is shaped by the worst behaviour that is tolerated.

Related: Why unmoderated online forums always degenerate into fascism

https://www.salon.com/2019/08/05/why-unmoderated-online-foru...

_v7gu · 5 years ago
Another reading of this (which I don't agree with) is that "liberal" ideas need censorship to survive. Which one is the correct interpretation is almost impossible to determine without sophisticated methods of testing.
asdff · 5 years ago
Or that fascist ideas tap into some primal root of our consciousness like a disease, latching on like an addiction that we are powerless to fight. In my mind tribalism is inherent to the human condition. Being ready to go off and violently slaughter people who aren't our kin is the very reason why our lineages exist today and why you and I have had millions of generations of parents surviving long enough to produce offspring. We are directly descended from the most violent and most tribalistic of humans, as the more peaceful lineages would have long since been killed off by our ancestors who practiced coordinated violence.

It doesn't take much to push people back into this primal level of thinking. We are predisposed to follow a charismatic leader, predisposed to follow a religion and adhere to it without any weighing of its systems with that of other religions or thinking critically about our beliefs, predisposed to fear the unknown 'other' rather than welcome them and their ideas. Millions of years of selective pressures have created who we are and how we behave toward one another, it's no surprise that there are some serious growing pains toward adapting to this new world where we attempt to treat others as equals rather than threats. Same sex marriage was only legalized five years ago in the US, after all.

betwixthewires · 5 years ago
But this is historically inaccurate. The issue of forums becoming hateful echo chambers is a new phenomenon, not a time tested one. Digg wasn't a terrible hateful echo chamber, reddit was not, most topic oriented forums are not, lots of alternative sites springing up nowadays are not, Facebook, MySpace, the list goes on.

No, the phenomenon, which really only lasted a few years, was due to the fact that the first diasporas kicked off of those sites (or that left due to crackdowns on their expression) were existing hateful communities. There's a name for the effect that I can't quite remember, but basically new communities are unsavory at first because they're made up of the unsavory characters that are unwelcome at the other communities, and this prevents their growth.

Thankfully we are currently seeing that effect wind down as well. With moderation expanding beyond plainly hateful content to anything a site deems unlikeable, those diasporas are becoming less and less unsavory, and more mainstream sets of ideas are being discussed on the newer forums.

I regard articles like the one you linked as yellow journalism, designed to discredit people's desire to collect and discuss ideas online freely, and to argue in favor of places where ideas cannot be discussed freely. If there is any one thing that causes sites to become hateful, it is the need for a site to promote "engaging" content for profit, something you are less likely to see on newer sites with less commercial pressure.

nemothekid · 5 years ago
>Digg wasn't a terrible hateful echo chamber, reddit was not, most topic oriented forums are not

Those platforms, from the topic oriented platforms, to digg & reddit all stressed moderation. What's new is this idea that any sort of moderation is an infringement of free speech and content platforms should moderate as little as possible. Moderation used to be far stricter when platforms were smaller.

croon · 5 years ago
I've been around since BBSes, and I can tell you that the success in avoiding hateful cesspools was very much linked to heavy moderation. The internet isn't very old in the grand scheme of things, but what you call "historically inaccurate" has been accurate for as long as there have been people on the internet.
giuliomagnifico · 5 years ago
Sounds good to me. Twitter is not really slowing down the like action; it will just ask you, "hey, this may be fake news, are you sure you want to like it?". People are stupid; in a perfect world this would be useless, but we don't live in one...
OneGuy123 · 5 years ago
Ironically, you missed the most important point: who decides what is fake news?

You still don't get it that _NO ONE_ can claim they know what is/is not fake news because everyone has their own incentives.

onion2k · 5 years ago
> You still don't get it that _NO ONE_ can claim they know what is/is not fake news because everyone has their own incentives.

This is precisely why it's important that Twitter calls out when things are stated as fact without evidence. Twitter never actually says things are fake; they say things are baseless and without evidence. If you want to post things that are influential, you simply need to back your claim up with something that people can verify from a trustworthy source. 'Fake news' will still be posted, because sometimes even a trusted source gets it wrong, but it'll happen far less often. That's the goal.

Suggesting that we should all adopt Nietzsche's perspectivist approach, where "there are no facts, only interpretations", is entirely unhelpful. You can't run a functioning society if you have to accept literally every batshit mental theory as "well, it might be right, we can't ever know for sure". If there is no evidence, you can say something is fake. You just have to accept that maybe 0.1% of the time you'll be wrong.

iso1631 · 5 years ago
https://metro.co.uk/2020/11/03/fake-robocalls-give-us-voters...

> Officials in Michigan reported on Tuesday that citizens of Flint, a predominantly black city, were receiving calls telling them to vote Wednesday, and not on election day. The calls are now being investigated by the FBI.

That is misinformation, no ifs, no buts.

entropea · 5 years ago
I agree with this for the most part.

In the early 2000s, Americans were lied into the Iraq war, with multiple newspapers practically begging [0] for war, and the critics were the ones on the correct side of history. If that happened in 2021, would the critics be silenced, have warnings on all of their tweets, and be told that they're supporting conspiracy theories? Questioning the official narrative of power is becoming wrongthink.

Who's deciding what's wrong and right?

[0] https://www.washingtonpost.com/archive/opinions/2003/02/06/i...

kristiandupont · 5 years ago
>You still don't get it that [...]

Have you been arguing with OP before? Or is this just how you address people in general? It doesn't set a very nice tone.

kanox · 5 years ago
> Ironically, you missed the most important point: who decides what is fake news?

In this case, Twitter. And it's fine to disagree with the fact checkers.

RickJWagner · 5 years ago
I agree.

Example: the famous photo of Anderson Cooper standing in a ditch, pretending there was a catastrophic flood. (Really, the water was only a few inches deep.) Would Twitter flag that one?

It's a given they'd flag Trump a bunch. How about Joe Biden saying he would not ban fracking? Would Twitter flag that one?

Twitter is going to have a very hard time making anybody happy with this idea.

Dead Comment

jmull · 5 years ago
I’ll just point out: if you’re a free speech purist, you are perfectly OK with this.

Twitter isn’t removing these tweets, just adding their own messages. That is, they are responding to free speech with their own free speech.

If they use this feature on high-profile figures it’s even better: free speech to authority.

betwixthewires · 5 years ago
That's some gymnastics if I've ever seen it.
jmull · 5 years ago
Please explain.
illuminati1911 · 5 years ago
Logical fallacy.

Twitter is not a free-speech system to begin with, so "free speech" purists are not obligated to be OK with this unless the whole platform is based on free speech. In that kind of system, any user would be able to prompt a popup for any other user about to like any tweet. As it stands, only Twitter has that power.

rbecker · 5 years ago
Do you think it's possible to hold an opinion other than "this should be illegal" and "this is OK"?

Do you think even free speech purists can see the danger in allowing too much influence concentrated in too few hands?

kanox · 5 years ago
> They are responding to free speech with their own free speech.

Yes, this is unironically a good thing.

It means that Twitter is acting as a publisher rather than a neutral channel but nobody is willing to pretend that corporations are neutral anyway.

aww_dang · 5 years ago
Just to put it in a different perspective, can you imagine using an IRC channel where a bot automatically applied fact checks to your statements?

Doubt that would've gone over well.

asdff · 5 years ago
That would have prevented a lot of flame wars
aww_dang · 5 years ago
Objective truth as revealed to us by the anointed technocrats and fact checkers has been accepted by partisans where it suits their biases. Questioning these official truths amounts to blasphemy in the social media space.

The premise of democracy rests upon the concept of free and open debate. If we cannot trust the public to consume information without hand holding, why should we trust them to vote on issues which impact our lives and property? Ironically, censorship is enacted in the name of protecting democracy.

Isinlor · 5 years ago
True censorship is when the government does it. If you go to jail for promoting some information, that's censorship.

Twitter and Facebook are part of "the public". You are even using, right now, a website very heavy on "censorship", or as I call it, "moderation": if you don't follow the HN guidelines, you will be silenced. The efforts by members of the public to reduce the spread of misinformation and polarization are part of why we should trust the public to handle information. And if you do not like HN, Twitter, or Facebook, you can look for another website.

Apparently Parler has recently been getting popular as an alternative to Twitter's and Facebook's moderation: https://en.wikipedia.org/wiki/Parler

input_sh · 5 years ago
Try posting anything remotely leftist on Parler and you'll be banned in less than a week. My profile lasted about 36 hours.

Completely the opposite experience of what quite literally is the first sentence on their homepage:

> Speak freely and express yourself openly, without fear of being “deplatformed” for your views.

betwixthewires · 5 years ago
Censorship is the hiding of information and the prevention of publishing. That's it. That's the definition.

HN's guidelines appear to be "put some effort into what you say and don't fight." They're not censoring any particular set of ideas.

rorykoehler · 5 years ago
I agree and disagree with this simultaneously. I don't think they should add restrictions like this; however, I'll paste a tweet I posted today describing what I'd love to see from Twitter:

>I'd love a filter where Twitter only shows real people who have verified their ID with a passport. No companies, no bots.

kanox · 5 years ago
It's not censorship if the post is still accessible. It's Twitter inc. expressing their opinion.