Barrin92 · 4 years ago
I think Doctorow is largely right that this uncritical acceptance of tech "magic powers" actually strengthens them; the 'rather evil than incompetent' argument works. As he alludes to in the piece, Karp and Thiel used to actively push PR presenting Palantir as this sort of omnipotent surveillance super-tool, because it actually makes it seem more intimidating.

Criticism of 'precrime' tech or robo judges should focus on how goddamn stupid it is to aim machine learning algorithms that were made for perception at tasks that involve complex cognitive and ethical human judgements.

Cambridge Analytica was a good example of this as well. The press fell over themselves to characterise their technology as some sort of election-tipping, mind-controlling super-tool. They probably loved it. In reality there's no scientific backing for their targeting, and it most likely did barely anything; they just wanted to look like James Bond's Spectre.

edit: another thing the post made me think of is how much the tech sector loves Yuval Noah Harari. He once said he's surprised by it but I'm absolutely not, because he basically constantly tells them how omnipotent and all-powerful they are.

There's actually a fantastic scene in The Young Pope about this (a satirical show about the internal workings of the Catholic Church)

> Do you know how many books have been written about me?

> Seventeen.

> Eighteen. The last one's going to press next week, and it's got the best title of all.

> Which is?

> The Man Behind the Scenes. I suggested it myself.

> Who wrote it?

> Manna, that leftist reporter.

> That means it's going to be critical of you, Your Eminence.

> Of course, those are the best. They turn you into a legend.

paox · 4 years ago
Dude, you misunderstand.

So does Mr. Doctorow.

Everyone who has shelled out cash for marketing and advertising knows it's crap. People who work in adtech know it's crap. Politicians know. CEOs know. They pay the bills.

BUT it's better than the previous crap and all other alternatives, thanks to more data. Which is why anyone who wants to influence, persuade or control others sits 24/7 on social media.

When it comes to the Game of Persuasion, if everyone is using crap tools, whoever uses them more has an advantage. Momentarily at least, until someone else has more energy or outspends you.

Attacking the crappiness of the tool won't reduce disinformation.

Billions have been spent on reducing the crappiness, and it's unsolvable, because you can never reduce the number of mindlessly ambitious chimps who will use whatever crap tools exist to gain influence, persuade and control others.

simonh · 4 years ago
Right, the point is you have to be in the game. If you pulled out because the game is crap and ceded it to your opposition, it wouldn't be crap anymore. They'd dominate the channel and own the signal. Better to pay your money, stay in the game and spoil it for everybody than let anyone else win.

Also, Cory correctly identifies, but dismisses, the real power of FB and similar tech, which is connecting people with stuff they're interested in.

"Seen in that light, “online radicalization” stops looking like the result of mind control, instead showing itself to be a kind of homecoming — finding the people who share your interests, a common online experience we can all relate to."

We know this. Algorithmically tuning your feed into torrents of stuff you already like is the issue. He is right that yes we can relate to this. The reason we use search is to find stuff we know we want more of, but being dumb chimps the uncomfortable fact is we don't always know what's good for us. Hence Oxycontin, it feels great until you depend on it to feel anything at all, and then you're dead. Or slagging off humans with vaginas for being TERFs. Or storming the capitol building chanting "shoot him with his own gun".

The problem then is, who gets to decide what's good for us? Maybe it's our right to die or become brainwashed however we choose. I don't know, I've got no easy answers for you here.

decasteve · 4 years ago
> Cambridge Analytica was a good example of this as well. The press fell over themselves to characterise their technology as some sort of election-tipping, mind-controlling super-tool. They probably loved it. In reality there's no scientific backing for their targeting, and it most likely did barely anything; they just wanted to look like James Bond's Spectre.

Much of what's still out there about Cambridge Analytica ignores the findings of the UK's Information Commissioner, which indeed found they were basically throwing data at scikit-learn and hoping for a result: the script-kiddie version of machine learning.

How many of those journalists have gone back to correct their misinformation on the topic? Many of the articles are still out there in their original form, and those hacks are still making hay on the subject.
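For what it's worth, the "script kiddie version of machine learning" looks roughly like this. A purely hypothetical sketch: the synthetic data, model choice, and numbers below are invented for illustration, not taken from the ICO report.

```python
# Hypothetical sketch of "throwing data at scikit-learn": default model,
# synthetic stand-in data, no feature engineering, no validation strategy.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Stand-in for scraped profile data: 1,000 "users", 40 opaque features.
X, y = make_classification(n_samples=1000, n_features=40, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestClassifier(random_state=0)  # all defaults
model.fit(X_train, y_train)
print("held-out accuracy:", round(model.score(X_test, y_test), 2))
```

A decent-looking score on held-out data like this says nothing about whether the model actually persuades anyone of anything, which is the gap between the marketing and the findings.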

long_time_gone · 4 years ago
== How many of those journalists have gone back to correct their misinformation on the topic? ==

What was their misinformation? It seems like they were right about Cambridge Analytica’s goals and tactics. You are claiming their technology wasn’t great, but that doesn’t change how they obtained it or what capabilities they were selling.

marcus_holmes · 4 years ago
Remember, there's also a powerful tendency to absolve people of the consequences of their actions by painting them as victims of powerful corporate mind-control.

If you eat fast food for every meal, is your obesity your fault, or are you a victim of food corporations' incredibly effective advertising?

If you smoke a pack a day, despite all the public information that smoking is harmful, is that your fault or the fault of the tobacco industry's advertising?

If Doctorow is right, and advertising is ineffective, there's a lot of people who will have to accept that their situation is a product of their own actions. For a lot of people, this is "victim-blaming" and unacceptable. It's a lot easier to have large corporations being evil powerful baddies who control us and therefore absolve us of responsibility for our actions.

harpiaharpyja · 4 years ago
There's no way to have any power in a situation without taking the perspective that your actions shape your situation though.
remarkEon · 4 years ago
Kind of off topic, but that show (The Young Pope Season 1) is very very well written, and is a good commentary on how to understand culture and realize how keeping things mysterious and hidden is a way to seed obsession. This scene[0] in particular is a good one.

[0]https://www.youtube.com/watch?v=hY8C3cIMR4o

godelski · 4 years ago
They say to never attribute to malice what can easily be attributed to incompetence or stupidity. I think the extreme version of this is when emergence through stupidity appears to be a global conspiracy of super rich/geniuses. Humans do love stories and legends. They distract us from the truth which is that the world is exceedingly mundane.
philipov · 4 years ago
You speak of Hanlon's Razor. There is a corollary called Grey's Law: "Any sufficiently advanced incompetence is indistinguishable from malice"
qsort · 4 years ago
Yes, I totally agree.

I guess it's human nature to prefer attacking on a moral angle as opposed to a pragmatic one (even though the latter is usually more effective).

We see this all the time even outside the tech sector, people will cry "omg you're such a racist" sooner than "you don't even understand numbers, you bozo".

grasshopperpurp · 4 years ago
I think it's clearly counterproductive to shout racist at everything; however, people who don't understand the numbers don't (generally) use numbers to define their stances. They choose what to believe in based on other beliefs, and they justify them with funny math.

The thing I like about your insult vs. "you're racist" is that you're attacking what someone is doing, rather than what they are. When confronting someone for being wrong, it's a lot more constructive to say, 'Hey, you're wrong about this, and this is why,' than it is to say, 'You're a moron (or a bad person), and this is why.' It's also more accurate. Labeling is helpful for navigating thousands of people in our communities and millions online, but keep those labels to yourself, and be ready to question them. Using labels during confrontation, though, will cause the other side to shut down, and that's where we are. We're more concerned with defining each other as others than with working together to find commonalities and build on points of agreement.

I'd just add that significant power imbalance significantly changes things and how we ought to attack disagreement.

planet-and-halo · 4 years ago
Aside from how absolutely sad the humorlessness of society these days is, it's also a catastrophe that we've taken humor out of the critical tool belt. H.L. Mencken has a great quote about it, something along the lines of the most good being done in the history of man by people throwing cats into temples.
boldslogan · 4 years ago
On the criticism of C.A., do you have any further reading? I recall a podcast where they interviewed one of the founders (his name was something comically evil, like the last name Spectre) and he was saying it was just consulting mumbo-jumbo and it didn't work, but now I can't find it.
bo1024 · 4 years ago
I think Cory misses a really big point here. He seems to assume that the source of disinformation is ads, such as political advertisement. This implies that advertising works if and only if disinformation works.

Of course, in reality disinformation goes through many channels on Facebook, but (my understanding is) a ton of it is organically shared and commented material and links, i.e. a different kind of content from ads altogether. So Facebook ads could still be ineffective, yet disinformation campaigns highly effective.

I'd also discount the studies on fake news from Spring 2017. The game has changed significantly since then.

My impression is a bit different from "criti-hype": I think Facebook is likely extremely good at promoting engagement, pulling attention, and changing minds -- HOWEVER -- FB has very little control over what content is able to be successful in this environment they've created. It takes an entire web ecosystem (quasi-news sites to host 'articles', etc etc) to conduct these disinformation campaigns and they have to resonate with people organically interacting, not just popping up in ads.

The fact that FB can't at-will divert all this attention onto breakfast cereal or brake fluid or whatever doesn't mean the disinformation is ineffective in the political sphere.

Edit: it's one thing to build an extremely fertile petri dish, it's another to control which organisms end up colonizing it.

88913527 · 4 years ago
Is there an appreciable difference between paid channels for disinformation and non-paid? Just yesterday on this forum it was discussed how 19 of the 20 largest Christian pages were Russian troll farms [0]. You don't have to be paying Facebook to craft a public narrative. Admittedly, it does take some capital investment creating the content farm: a small army of modestly paid office workers abroad.

[0]: https://news.ycombinator.com/item?id=28691585

boomboomsubban · 4 years ago
>...were Russian troll farms

From that article

>These groups, based largely in Kosovo and Macedonia

podiki · 4 years ago
While I partially agree with there being a difference between just ads and what we're seeing happen, I didn't read it as being a 1-to-1 correspondence. Instead, it was pointing out the perhaps limited influence of...influence. In that maybe this isn't really "changing" people but highlighting them, giving them a platform, letting them congregate. Surely it attracts people that may not believe something yet and are susceptible, but the argument seems to be more of amplification and noticing this, rather than creating it as much. In other words, remove FB and you'll still have the same beliefs, just not as obvious to the outside. At least that was how I read some of the article. (I do think FB and the like are making things worse, but does it just look like that because we can see it better? or have the numbers grown rather than just each gotten more bold?)
raxxorrax · 4 years ago
The political split we mostly see is that of partisan reactionaries. It isn't the disinformation itself that is powerful; it is that it is seen as manipulating millions of people, and some groups argue to remove that content. I think this is the fitting analogy for the article. The act of censorship gives misinformation its power, not people truly believing it.

Nobody took Alex Jones selling man pills seriously. Banning him turned him into a martyr and confirmed his theory that he is enemy of the state no. 1. A self-fulfilling prophecy.

tdeck · 4 years ago
People took Alex Jones seriously enough to harass the parents of children who were killed in school shootings. It's that kind of behavior that got him banned in the first place.
prancer_or_vix · 4 years ago
The political divide is because the institutions of power are winning.

It will be and always has been the goal of those with power to deflect attacks directed at them, usually to other groups who are attacking them. For all of his faults, Biden has been extremely good at convincing the lefties that "well, we'd be back to normal now if those damn Republicans would just get vaccinated", effectively turning the normal folks on the left against the normal folks on the right instead of where the anger should be directed: toward the institutions that let millions of folks go homeless, financially and medically insecure, etc. in the face of the pandemic.

The Republicans in power are, I think, a little better at playing this game: they whip their base into a furor over the group in power but only in such a way as they (those already with power) still come out on top. The people are angry at the right group, but for the wrong reasons (and really just to further the agenda of the _worst_ people).

dTal · 4 years ago
The converse observation also holds - if people didn't take him seriously, he wouldn't have been important enough to ban.

The fact is that people do take him seriously, and it's a huge problem. You can ignore the problem, but the problem won't ignore you.

sufehmi · 4 years ago
Indeed, most disinformation in Indonesia spreads via content, not Facebook ads.

But there's still a problem with Facebook: the echo-chamber algorithm traps people in their own bubble world.

I once tried to like only funny posts. Within a week, nearly all the content in my timeline was jokes and funny posts.

Now that's scary.
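That feedback loop is simple enough to reproduce in a toy model. A hypothetical sketch (the categories, starting weights, and 5% per-like boost are invented for illustration; this is not Facebook's actual ranking code):

```python
# Toy model of an engagement-driven feed: each "like" on a category
# raises that category's sampling weight, so the feed converges on
# whatever the user engaged with first.
import random

random.seed(42)
categories = ["news", "jokes", "sports", "politics"]
weights = {c: 1.0 for c in categories}

def next_post():
    # Sample a post category in proportion to its accumulated weight.
    return random.choices(categories, [weights[c] for c in categories])[0]

# Simulate a user who only likes jokes, over a "week" of scrolling.
for _ in range(200):
    post = next_post()
    if post == "jokes":
        weights[post] *= 1.05  # each like compounds the bias

feed = [next_post() for _ in range(100)]
print(f"jokes in a 100-post feed: {feed.count('jokes')}")
```

Even this crude positive-feedback rule collapses a four-category feed into mostly one category, which is the bubble effect described above.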

square_usual · 4 years ago
One of the biggest and most common attacks against Facebook is that its algorithms are what drive misinformation so much on its platform; if those algorithms are so powerful, then why would Facebook's ads algorithm be so much worse?
bo1024 · 4 years ago
Yeah, that's the question. My attempt at an answer is that while they can set up a powerful environment for disinformation to thrive and influence people, it doesn't mean they can use it to push any random product with the same level of success.
88913527 · 4 years ago
If there isn't a "lying flat" movement for social media, there sure should be one. It characterizes my feelings perfectly: I give up on it. Nearing 1 year without Facebook, I can confirm it isn't essential and your strongest network connections will contact you if they wish to.

If Facebook could exist as it did one decade ago, maybe it'd be different.

dade_ · 4 years ago
It’s been over 7 years since I deleted my FB and I’ve never had Instagram. Their IP address scopes are blocked in my home. I do miss the now ancient FB, before they destroyed it with algos controlling what I got to see. That is when I got rid of it. Every so often someone will show me some hatred spewing diatribe someone in their family posted, and all the people piling on and I am happy being unaware. I get my quasi-news, conspiracy updates, and assurances that the world is about to end direct from Zerohedge.
throwawaysea · 4 years ago
I can barely remember the old, authentic social media at this point. That legacy has been completely drowned out by how dark and angry the world has been made by a few monopolistic tech companies squeezing out market share, engagement, and profit at all costs.
geoduck14 · 4 years ago
> I do miss the now ancient FB

Do you remember when the bottom had a quote, "too close for missiles, I'm switching to guns"? My friends and I wondered what that meant for a long time.

bmarquez · 4 years ago
> I do miss the now ancient FB

I miss the "Random Play" option in the "Interested In" field. The modern incarnation is Facebook Dating, which requires the mobile app, so I'll never use it. I also suspect they use the attributes to target ads (e.g. shirts for tall men if you fill out the height field).

sloshnmosh · 4 years ago
+1 for the zerohedge comment.

A few months ago something happened over at Zerohedge. The site became unviewable with JavaScript disabled, so I stopped going to it. But it looks like they've had a change of heart, and it can now be viewed with JavaScript disabled.

But it also seems like the content has changed from what it used to be.

Did they change ownership or is it just my imagination from taking a break from their site when it became dependent upon JavaScript?

m_ke · 4 years ago
Yeah also deleted my account a long time ago and blocked their IPs. Now I just need to get rid of 90% of my code, which uses pytorch, react and react native...
VonGuard · 4 years ago
100% agree. Leave Facebook and social. IRC is much better. Frankly, context is the issue. My group of friends should not intersect with some other random group of friends. Content should remain within its context. Sharing to everyone is the worst thing possible. It just causes huge conflicts from different contextless people clashing over content the original poster usually considers harmless. Like, if anti-vaxers stayed in their own little groups, it might not be such a problem, but their content gets posted to everyone in everyone's networks, and the meme virus spreads, lacking the important "the people saying these things are fucking idiots" context.
sk0g · 4 years ago
This is pretty much what Google+ had, and what it ultimately failed because of.

If only relevant content is shared within a 'circle', people appreciate it, sometimes ignore it, and move on. Consistent controversy and drama is avoided, which just so happens to be a key driver of social media engagement.

bmarquez · 4 years ago
> IRC is much better.

IRC is too technical for many people I know, but they are willing to use, or are already using Discord.

sasaf5 · 4 years ago
IRL is even better :)

Deleted Comment

Deleted Comment

dreen · 4 years ago
I quit FB about 10 years ago, but I'm considering coming back because I keep hearing about the value of Groups. Sure, it's just mailing lists with a UI, but that's where the people seem to be.
rendall · 4 years ago
The concept of disinformation as it's used today, and Doctorow touches on this in his essay, is itself infected with bias. We all can point to true and correct information that was at one point or another labeled as disinformation when it actually only was inconvenient to one or more powerful actors.

Personally speaking, the label has come to mean to me something other than "false information, intentionally spread". To me it has come to mean "something that could be false, but someone definitely doesn't want it talked about".

pharmakom · 4 years ago
My understanding is that disinformation is information known to be false that is deliberately spread to further some agenda. Misinformation is false information spread without that intent.

From this definition, what’s interesting is that you can only identify disinformation by knowing 1) what is actually true and 2) the motivations and methods of the person spreading it.

I can think of two good examples of disinformation:

1. When oil companies denied climate change despite their internal research

2. When tobacco companies denied the importance of nicotine in their products

CaptArmchair · 4 years ago
It's more than that. It's not about whether information is "true" or "false". What matters is the "falsifiability" of statements and assertions shared online in the first place:

> In the philosophy of science, a theory is falsifiable (or refutable) if it is contradicted by an observation that is logically possible, i.e., expressible in the language of the theory, and this language has a conventional empirical interpretation.[A] Thus there must exist a state of affairs, a potential falsifier, that obtains or not and can be used as a scientific evidence against the theory, in particular, it must be observable with existing technologies.

https://en.wikipedia.org/wiki/Falsifiability

A lot of disinformation boils down to making a claim that can't readily be falsified, while the claimant defends their stance using logical fallacies to deflect criticism, such as shifting the burden of proof to the other party.

In the past year and a half, a lot of claims made about the pandemic simply couldn't be falsified for lack of technology or methods to observe a potential falsifier. That doesn't mean those claims are simply false or true. It means they are plausible or probable, which are very different qualifiers regarding veracity. A typical example is the heated debate about the efficacy of masks and mask mandates.

Access to information has also demonstrated a downside. Research studies have been widely quoted and cited within the public debate to support all manner of arguments. However, by pulling studies out of context, people strip them of their value. Many tried to support their claims by cherry-picking, ignoring the scientific publication process (e.g. peer review), and so on. The net result is scientific research, regardless of its veracity, repurposed as token support for confirmation bias in the public debate via thinly veiled appeals to authority.

The relevant question regarding disinformation isn't the veracity of statements, but rather: who are you anyway? Where are you coming from? And why are you making these claims in the first place?

sk2020 · 4 years ago
3. When pharmaceutical companies denied the risk of novel gene therapies for inoculation against a family of cold viruses.
notacoward · 4 years ago
Unfortunately, the most strident voices saying that any term is overused are usually those to whom even strict definitions apply.

"I'm not anti-sneetch, I just..."

Assuming that people use a term because of their own agenda is not helpful IMO. Maybe they actually do believe the information was false, and that it was spread deliberately. Why not give them the benefit of the doubt that you insist on yourself?

prox · 4 years ago
I would go further and say it is the exploitation of emotion to whip people into taking opposite stances (or a wanted stance), thus hindering any dialectic or reconciliation process. We can all see how this campaign is a 24-hour show, so there is never any respite to cool down and take a breather. Disinformation is a tool for those nation states or political parties with the power to wield it.
int_19h · 4 years ago
This video will make you angry: https://www.youtube.com/watch?v=rE3j_RHkqJc
isaacremuant · 4 years ago
Absolutely. I find it horrific how, due to politics, we've moved all the way from "censorship is bad" to "we need to stop disinformation" which clearly implies soft or hard censorship. It also heavily leans to whatever political biases the company has and people think that's perfectly fine because they're getting one over their political opponents.

The end result is that lies become truth and truth becomes lies because dogma spread by tech companies (lobbied by other companies or political groups) is what is pushed and accepted while dissent is punished in varying degrees.

All in the name of "the greater good", non ironically. Seems we learned nothing from fiction or history.

smileysteve · 4 years ago
It has never been "censorship is bad"; it has always been "government censorship is bad"

Never has a private newspaper been required to publish every comment or opinion that they receive despite its inaccuracy or obscenity.

pydry · 4 years ago
Warnings presented alongside content aren't censorship.
nonameiguess · 4 years ago
There is a lot of scope creep in how this has evolved over the past five years. Early in 2016 when "fake news" first started being widely worried about, the Washington Post did an expose interview of two guys in Long Beach, CA who had created a series of fake newspapers, not staffed by any real reporters or writers, where they wrote all the articles themselves under fake names, and just completely made up both the headlines and the content based on what was generating the most clicks. It largely ended up as political content because it happened to be an election year and that sold, but it was strictly made up. They made no attempt at all to even care about real events. This was very unambiguously fake news, and it wasn't uncommon for sites like this to pop up out of nowhere because it was cheap money for unscrupulous types to just feed the outrage machine and profit.

But two things happened. First, something that is explicitly fake is actually pretty easy to identify in an automated fashion, and thus easy to root out once you know you're looking for it. Second, Trump himself took the word after it became apparent that many of the suckers being suckered into clicking on this crap were his strongest supporters, and completely poisoned the well by calling anything and everything he didn't like "fake news", as if an actual professional reporting operation at least attempting to do real investigations and interviewing real people were equivalent to two guys in an apartment making up headlines completely from their own imaginations.

Disinformation campaigns are somewhat orthogonal to that and a thornier issue. We're conflating many things. Some are just larger troll farms that are exactly equivalent to the two guys in an apartment, but operating with professional backing and financing in a more sophisticated manner. But they're just trying to make money. Some of what they say might actually be true. They don't care and aren't trying to push any particular agenda. But they're also not trying to produce true information and do so only incidentally if they do at all. At least some of it is actual foreign intelligence services conducting psyop campaigns, which definitely do care and are trying to push a particular agenda, but it's just to sow internal discord. They don't really care about one side or another of a contentious debate and will gladly inflame the worst actors of both sides. But these are being conflated with likely sincere people and organizations who are doing some level of real investigation and actually believe what they're writing, whether or not it's accurate. These are all umbrella'd under the "disinformation" term, but they're not the same thing.

Having Covid happen destroyed all hope of cleaning this up, too. We're now in the middle of an important global event affecting nearly everyone, but about which there are little in the way of facts, and we certainly can't know with certainty what an optimal course of action as a response was or should have been. We'll never be able to observe counterfactual worlds. Many important facts will only become known years in the future, and some will never become known at all.

square_usual · 4 years ago
E: The tone and direction of this comment would've distracted from discussion of an otherwise interesting piece, so I've changed it.

You can also read this on Cory's own website: https://pluralistic.net/2021/09/30/dont-believe-the-criti-hy...

or on archive.is: https://archive.is/iAsXZ

(My apologies to people who responded to the earlier version of this)

throwawaysea · 4 years ago
Thanks for the reminder; I forgot he has the same posts available at pluralistic. Unfortunately I don’t see a way for me to edit the link.

I agree Medium is not a great platform. Dark patterns aside, I was also disappointed to see their heavy censorship early in the pandemic, which kept many interesting analyses and speculation from being shared.

TechBro8615 · 4 years ago
Does anyone know who runs archive.is? I imagine they’ve got some good data on news readership.
1vuio0pswjnm7 · 4 years ago
Search engine cache: another way to avoid the medium.com website.

https://webcache.googleusercontent.com/search?q=cache:https:...

bumbada · 4 years ago
I believe that Doctorow is missing the most important issue here.

The elephant in the room is the privatization of your social space and that Facebook is a private spy agency. That is what makes FB extremely dangerous.

The fact that FB is private, efficient and full of technically brilliant people is not good for the rest of society, because society cannot imagine what those brilliant people could do with their data.

For example, I go to a party with friends. I don't have Facebook, but one of my friends has it. He takes a picture of the group and posts it on FB.

Now the fact that I went to the party, the place of the party, the time, the people who went, and the private opinions of some friends about other friends, girlfriends/boyfriends or lovers (if they used FB channels like WhatsApp to communicate) have been recorded for the rest of my life and can be accessed by American spy agencies like the NSA.

Society expects FB to forget, because humans forget, but machines never forget.

People talk about a crush on someone with a friend and expect nobody to be listening (on WhatsApp), but a bug is always on, recording everything they write or say.

FB is like the Stasi, only far more efficient, and it applies to the entire world's population instead of just one country.

And they are constantly pushing the boundaries of privacy, like with "smart glasses", so the spying extends fully into your life.

And they are constantly improving their AI techniques, so even if they can't technically do something today, that doesn't mean they couldn't do it in five years with the data they have already collected about you.

slivanes · 4 years ago
Engagement or enragement, they still get you.

I get the feeling that visitors on Facebook are often angry and annoyed, and Facebook is fine with that.

Deleted Comment

LurkingPenguin · 4 years ago
I always get a small chuckle when I read an article about how horrible FB is. Not because it isn't horrible. It literally is the digital devil. But because 90% of those articles have a FB share button on them. As this one does.