AlbertCory · 3 years ago
The attention economy corrupts everything it touches: not just science, but journalism, politics, and even childhood.

Being famous used to be rather difficult. Of course there were exceptions (writing To Kill a Mockingbird, being the guy who dove into the river to save a drowning child, for example), but for the most part, you were going to live your life known only to the few hundred or thousand people you met personally.

Even though you could pick up a phone and dial anyone in the world who owned a phone, you wouldn't, and if you did, they'd hang up on you. Now you can force your idiotic, or great, ideas onto the screens of millions of people you'll never meet.

Is that a good or a bad thing? It's certainly bad in some ways, and this is one of them.

NL807 · 3 years ago
The problem seems to be more fundamental. Attention is not inherently bad; the issue is how, and what kind of, attention is rewarded. Many platforms reward attention-seeking behaviour, both good and especially bad, because it easily evokes primal emotions in the audience. And so there are incentives for content creators to keep peddling shitty content.
meowkit · 3 years ago
...it is more fundamental.

It's Moloch - the optimization for one criterion (https://slatestarcodex.com/2014/07/30/meditations-on-moloch/) on the grandest scale. Here it's attention, as in the "attention economy", but attention is just a signal for revenue opportunity, which is profit extraction, aka the optimizing measure of Capitalism.

To be clear, I am not even against Capitalism. It's been a powerful tool for driving market economies, and in conjunction with social justice it has lifted all boats. The problem now is that it's been eroding the foundational societal elements that make contemporary society (and Capitalism itself) possible in the first place... like attention, science, community, political sense-making, and more.

This is different from, but related to, Goodhart's Law (https://en.wikipedia.org/wiki/Goodhart%27s_law).
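To make the Goodhart dynamic concrete, here's a minimal toy simulation (the numbers and names are invented purely for illustration): a platform promotes whatever scores highest on an engagement proxy, creators shift effort toward outrage because that's what the proxy rewards, and the promoted content's measured engagement climbs while its true quality collapses.

    import random

    random.seed(42)

    def make_item(outrage_effort):
        # True quality falls as creator effort shifts from substance to outrage.
        quality = random.gauss(1.0 - outrage_effort, 0.2)
        # The proxy the platform can measure: correlated with quality,
        # but far more responsive to outrage.
        engagement = quality + 3.0 * outrage_effort + random.gauss(0.0, 0.2)
        return quality, engagement

    def promote(outrage_effort, n_items=1000, top_k=50):
        # Rank a day's content by the proxy and promote the top slice.
        items = [make_item(outrage_effort) for _ in range(n_items)]
        top = sorted(items, key=lambda it: it[1], reverse=True)[:top_k]
        avg_quality = sum(q for q, _ in top) / top_k
        avg_engagement = sum(e for _, e in top) / top_k
        return avg_quality, avg_engagement

    # Creators gradually learn what the ranking rewards.
    for effort in (0.0, 0.25, 0.5, 0.75):
        q, e = promote(effort)
        print(f"outrage effort {effort:.2f} -> promoted engagement {e:.2f}, true quality {q:.2f}")

The measured number improves every round while the thing it was supposed to stand in for degrades: optimization for one criterion, in miniature.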

user3939382 · 3 years ago
The thing that scares me the most is that, unlike reading a novel or pensively writing a letter in the 19th century, we now focus much less on one thing for long periods. You can see it even on HN, but look especially at TikTok and YouTube "shorts" — the dopamine pinball game going off in our brains and the constant change of focus is robbing us of a skill, and I fear that will have unforeseen consequences at the scale of our global society.
lkrubner · 3 years ago
And yet, in my career, I've noticed the rewards are increasing for being the person who is willing to focus on one thing for a long time (for several weeks, or months). For instance, I've never been the kind of software developer who could write obviously clever code. But I have written code that was admired and praised, and sometimes seen as the salvation of the company I was working for -- not because I'm especially skilled as a software developer, but because I was willing to think about specific problems, deeply, for longer than anyone else at the company.

In 2012/2013, to the extent that I helped re-invent the tech stack at Timeout.com, it was because I was willing to spend weeks thinking about exactly why we'd reached the limits of what we could do with various cache strategies, and then about what would come next. I then introduced the idea of "an architecture of small apps" -- the phrase I used because "microservices" didn't really become widespread until Martin Fowler wrote his essay about it at the very end of 2013. Likewise, I now work as the principal software architect at Futurestay.com, and my main contribution has been my willingness to spend weeks thinking about the flaws in the old database schema, and about what we needed to do to streamline our data model and overcome the tech debt that had built up over the 7 years before I was hired.

We live in a world where there are large economic rewards for the kinds of people who are willing to think about one thing, deeply, for weeks and weeks or even months and months, until finally understanding a problem better than anyone else.

I have to hope some young people eventually escape the attention-sucking technologies that try to sabotage their concentration, and eventually discover the satisfactions of thinking about complex problems, continuously, for months and months and months.

bawolff · 3 years ago
Pretty sure being famous is still pretty damn hard. For every virally famous TikTok influencer there are thousands of wannabes that nobody cares about.

> Now you can force your idiotic, or great, ideas onto the screens of millions of people you'll never meet.

I'm sorry, did a hand reach out of your phone and force you to install and view the app of the week?

godelski · 3 years ago
Still hard, but I think the parent is arguing that there are more famous people today than, say, 50 years ago. One of the things the internet did is create more niche groups. There are way more B-list and C-list celebrities than ever before. Being A-list is still very hard, but being a celebrity in general is easier. Plus there are all the one-hit wonders, and many more of them. Even the "runs into a burning building" famous people are much more likely to become known across the country or the globe than before.
Retric · 3 years ago
Being famous doesn’t require my personal attention, just the attention of people who actually downloaded the app of the week.

The barrier to being famous dropped as people spend less time on any one thing. A few seconds of attention is qualitatively different from reading a novel or even watching a movie.

omegalulw · 3 years ago
I don't understand this attitude. Who is the judge of what ideas are great and what ideas are idiotic?

nopenopenopeno · 3 years ago
>Now you can force your idiotic, or great, ideas onto the screens of millions of people you'll never meet.

No you can’t. Attention is scarce. That’s why it’s called “attention economy”. In fact, it was far easier to force idiotic ideas onto the screens of millions of people before the internet came along. That’s what TV commercials were.

NL807 · 3 years ago
I think you're thinking of corporate/org-scale attention seeking. Putting things into historical context, getting attention today is certainly easier for a random individual. In fact, anyone can directly compete with large organisations and even surpass them in some cases.

You can set up a YouTube account, then do episodic dumb shit, and I can guarantee you'll get thousands if not millions of views eventually. You simply couldn't do that 30 years ago.

autoexec · 3 years ago
> That’s why it’s called “attention economy”.

Those are your words. How many people do you think have already read them? How many more do you think will have read them after 10, 20, or 50 years?

If you'd said the same thing before we were using computers to communicate with each other, who would you have said it to? How many people would have heard it? How long would those words have been discoverable? What chance would you have had of your words reaching millions of people?

Forget about you, what chance would I and everyone else have? Speaking as someone who was born before the internet, it's much easier now than it was when I was a child for my words to reach a massive number of people, and every child alive today has a better chance than I did at their age.

Attention is scarce, but with near-instant communication over a global network of billions of people, you only need a tiny fraction of them looking your way to get your ideas onto the screens of millions.

guerrilla · 3 years ago
> and if you did, they'd hang up on you

Not so much as you'd think. Try it. You'd be surprised.

AlbertCory · 3 years ago
You should write a blog post about that, with transcripts. I bet it'd get lots of attention and go viral.

See how easy it is to become famous? I just told you how :)

autoexec · 3 years ago
These days I'd be surprised if they even pick up in the first place. If I don't know you and I'm not expecting your call, you get voicemail. Some folks basically never answer their phone because 'everyone I want to talk to knows to text me'.

I don't doubt you'll eventually reach someone who would stay on the line and chat a bit, but unless you're lucky it could take a while.

1vuio0pswjnm7 · 3 years ago
It is well established that the human animal evolved to live in small groups. When people come together in large numbers, it is a special occurrence, limited in time and space. The idea of living "as if" one is always in this situation is unnatural and arguably unhealthy. It is not something we should be promoting or even allowing. We should be promoting small groups.

If I were asked to "regulate" this problem of so-called "altmetrics", and the "attention economy" in general, here is how I would do it.

Twitter used to be based on SMS, but since 2020 it is just a gigantic website like Facebook. These two mega-sized websites are the primary sources of "altmetrics". If we take away the right to create these gigantic outsourced websites, what would happen?

I would place limits on websites that are comprised of so-called user-generated content. For example, if someone wants to run a website with millions of pages, they are free to do so. (If they could actually produce enough content to justify so many pages.) However, they are not free to have millions of different people author the pages. A website could not be a mass-scale middleman (intermediary) for people who wish to publish using the www. A mega-website doing no work to produce content, financed solely by selling people out to advertising, could not, for example, supplant an organisation that employs journalists and editors to produce news.

By regulating creation of these mega-websites we could reduce the incentive for advertising. The mega-websites would lose their traffic and disappear. They would be replaced by normal-sized websites that cater to specific audiences.

Allowing a few websites to grow to enormous size while not having to do any work to produce content has been a mistake. Of course they can make billions in advertising revenue. It also allows any notion of egalitarianism in the www's design to be compromised in favour of a sharecropper model, so-called "platforms".

Without oversized websites no one would be able to publish their content to every website in existence. No website would be able to do zero work to create content and yet act as a middleman drawing massive traffic that can be monetised via advertising. That is what these mega-websites like Twitter and Facebook do. They sit between people who create content and people who consume it and sell these people out to advertisers.

The cost of publishing data/information to the www will continue to fall. The technology to make it easy will continue to advance. We do need to be able to communicate in small groups, as we have always done. That is possible. We do not need to collectively use mega-websites with billions of pages run by a handful of third parties in order to do it. The follow-on effects of millions of people communicating via these third-party websites are obviously harmful.

AlbertCory · 3 years ago
Some excellent ideas here. Can you make an argument that they're constitutional, though? Because that would be the legal problem.

Let's take a pre-web TV show: America's Funniest Home Videos. That showed UGC to millions of people, allowing millions to propose their own (of course, the producers were the deciders, at least up to the point where the videos went on the air). So how is that show different from the websites you'd prohibit?

endorphinbomber · 3 years ago
You are saying that we should prohibit large group gatherings because they are "unnatural" and "unhealthy", yet you provide not a single source, or even a definition of what a large group is supposed to be. Quite a grandiose and all-encompassing claim to make without a single source to back it up.
gsatic · 3 years ago
Isn't it already breaking down country-wise? Most poorer countries don't have the infra/resources to run these sites, so they have to latch on to the mega nets. But the more developed ones are all inching towards their own nets.
anm89 · 3 years ago
Came here to say this. There is nothing meaningful in our lives which goes completely unscathed from the cultural destruction of the attention economy.
ShamelessC · 3 years ago
> Being famous used to be rather difficult. Of course there were exceptions (writing To Kill a Mockingbird...

Shots fired!

thghtihadanacct · 3 years ago
Humans ... us people things, etc ... are prone to our most immediate concerns. It used to be that predators were our concern; now it's social media. We react, we make noises, we continue. The unfortunate thing is that there are no lions to take out the weak anymore.
jostmey · 3 years ago
Having spent over 10 years in a university, including as a professor, I'd say the problem isn't attention-seeking behavior but a lack of accountability. For example, you can literally make up any data you want in a grant proposal, and so long as it sounds right no one can or will double-check it. The foundation of academia is rotting, but maybe it's always been like this.
club_tropical · 3 years ago
It has not. And "a lack of accountability" is a band-aid on the real problem: bad gatekeeping. People getting into science, not for the search for truth, but in search of respectability, green card, money, or whatever else. Trying to whip them into real scientists through transparency and accountability is like trying to achieve security in your home by flinging the gates and doors wide open but slapping cameras and motion detectors everywhere. Either they win, or you get fatigued.
AussieWog93 · 3 years ago
>People getting into science, not for the search for truth

I got into science in order to discover truth and develop new things.

Left after less than a year, got a job as a SWE then eventually started my own business selling used video games of all things.

Have never met a group of people less interested in the truth, or a system that goes to such extreme effort to actively inhibit discovery, than what I experienced in academia.

A friend who works for a private non-profit "researching" cures for cancer has reached the same conclusion there too.

Getting money from rich old people/the government takes precedence over science and it's just sad.

autoexec · 3 years ago
> the real problem: bad gatekeeping. People getting into science, not for the search for truth, but in search of respectability, green card, money, or whatever else.

What possible gate could keep out people who get into science for the wrong reasons? As long as there are people willing to pay off a scientist, there will always be scientists willing to pocket the money in exchange for lies. As long as scientists are well respected, people who want to be respected will give consideration to becoming a scientist.

All we can do is open the gates and doors, keep an eye on everyone who comes in, and kick out the ones who cause problems. In fact, it's better not to be aggressive about gatekeeping when it comes to science, because otherwise you risk kicking out someone who could have discovered something important. Transparency and verification make for good science anyway, so it's not as if anyone would be saving money on 'cameras and motion detectors' by locking the gates and doors and only letting their friends in.

Accountability is the key here and the lack of it is not just limited to scientists. Not long ago in the US a doctor who believes that tumors and cysts are caused by the sperm of demons found her way into international headlines. She is also on record saying that medications contain the DNA of aliens. She still got to keep her medical license and she has a medical practice. There is almost no accountability at all.

kajaktum · 3 years ago
I think there are very few people who are truly scientists, in the sense that they spend their lives trying to solve multiple problems. Most people have 1 or 2 grand ideas and they execute them. I've seen "scientists" who are truly intellectually bankrupt; they just need to do certain things throughout the year to pass some checks and get paid.
marcosdumay · 3 years ago
> People getting into science, not for the search for truth, but in search of respectability, green card, money, or whatever else.

I doubt that. It's much easier to get all of those things elsewhere (ok, maybe not the green card, but the fact that this is not a US-only problem rules that out too).

If it is an issue of systemic corruption, people are getting corrupted after they walk in, not before.

godelski · 3 years ago
> lack of accountability.

I think this also applies at all levels. I'd argue that it is the big reason people feel very frustrated with reviewing, especially in hyped areas (e.g. ML). There are plenty of incentives to reject papers (low acceptance rates mean "higher quality" publications, you can advantage yourself by rejecting or being overly critical of your competitors, and you get no clout for reviewing so no one will get upset if you take all of 5 minutes to review), but very few incentives (I can't even name one) to accept papers. It is fairly easy to dismiss papers as not novel, because we all build off the shoulders of giants and things are substantially more obvious post hoc. Metareviewers and Area Chairs will 99/100 times stand with reviewers even if they are in the wrong and can be proven so (I had a reviewer give me a strong reject claiming I should compare to another work, which was actually our main comparator and which we compared against in 5+ tables and 5+ graphs).

I can't see these issues being resolved until we all agree that there needs to be some incentive to write high quality reviews. The worst of it is that the people it hurts the most are the grad students and junior researchers (it prevents graduating and career advancement). I'm not saying we have to accept papers, but I am saying we need to ensure that we are providing high quality reviews. Rejections suck, but rejections that don't provide valuable feedback are worse.

If the publication system is a noisy signal then we need to fix it AND recognize it as such. There have been plenty of studies showing that this process is highly noisy, but we're all acting like publications are all that matter.

This is all before we even talk about advantages linked to connections even in double blind reviews, collusion rings, or citation hacking. I feel we can't even get the first step right.

enviclash · 3 years ago
We all get papers rejected. Nobel prize winners get papers rejected too. And rejection is one of the very few tools we have to stop the crap from flowing in. This is not to say that all rejected papers are crap. Sorry to hear about your bad time with rejections; it is a universal thing.
chaosbolt · 3 years ago
The biggest newspapers are writing stories with no proof or facts, and no one cares anymore.

No one cares. When you can watch 5 movies in a day and feel all sorts of emotions people used to feel only a few times a year, watch porn, play video games, and read about anything you want, you have little time or neurochemistry left to be mad at scientists, journalists, or politicians lying to you.

colejohnson66 · 3 years ago
Let’s not pretend that yellow journalism is a recent thing. It’s been around since before the automobile. The difference is that we have internet. So it’s easier to fact check, but it’s also easier for lies to spread more widely before being corrected.
wrp · 3 years ago
I have read a lot about the history of science in Britain. Even before the creation of the Royal Society, people doing science generally all knew each other and communicated extensively, so I think there was a high degree of accountability. Even in the 19th century, scientific sub-communities were small. I suspect that the problem of accountability arose in the 20th century with the expansion of institutional science.
anonporridge · 3 years ago
> but maybe it’s always been like this

The older I get, the more I believe this is the truth, for most institutions we've been taught to hold in high regard.

marcus_holmes · 3 years ago
This. I'm actually optimistic (I know, Unpopular Opinion) about the future for a bunch of reasons, but one of them is the increased transparency that we now have on our institutions.

It used to be that the media controlled how much of what went on we saw, and the media was part of an establishment that valued the status quo and "stability", so it didn't report on some things.

Now, the blinders are off and we're seeing what was always there.

For Science, this means that the institutions set up by a bunch of rich men who could afford to spend their time satisfying their curiosity are now visibly falling apart. Because we can see this, we can change it into something better. The pressure to kill off the journal system is growing. The various crises that the article mentions, etc. This is all good, in the long term.

derbOac · 3 years ago
I don't know. I've been around academics for a while now and have really tried to get a good answer to this, and I still don't know. If anything, I don't think it's always been like this.

There are a variety of indicators suggesting that something changed with academics from the mid-90s to about 2000 (grant success rates, publications, etc). When I talk to much more senior colleagues, I also get the sense something changed, even if they describe it in different terms.

Was everything better before? No, I think many things are better now, but it feels like the fundamentals of the system are worse. To me it feels like being in a house that's been remodeled with a new roof and HVAC system, but where the ground is sinking and the foundation is in the process of collapsing.

chiefalchemist · 3 years ago
While I hate to parse words: by definition, fabrication of data is not science. It's fraud.
DiggyJohnson · 3 years ago
That's just restating the problem in a single, flashy sentence.
wisty · 3 years ago
As far as I can tell, there's way more accountability in science now, just like there was more accountability in a Soviet factory than an American one in the Cold War. It just doesn't always achieve its intended results.
Ztynovovk · 3 years ago
You have any data/research to support your experience?
viridian · 3 years ago
What most people fail to realize is that, fundamentally, most principal investigators, the people who actually run the research world, are primarily fundraisers. Their day-to-day job is a mix of grant and proposal writing, relationship building/organizational meetings, and checking in on their postdocs, candidates, and lab techs.

Your main goal, as a PI, is to keep your lab running, and thus research flowing, by any means possible. For some PIs, this means milking every drop of available talent and time out of their doctoral candidates, which is the most common cause of the horror stories you hear about people leaving academia. At the other end of the extreme, they can embed themselves so deeply in fundraising with private or public capital that their lab staff don't see them for more than 15-30 minutes a week, because they are essentially living their lives hopping from sales meeting to sales meeting.

This wouldn't be a problem if the job of the PI was explicitly meant to be that of a salesman, but the actual role of a PI is to define the research being done. They draft the hypotheses, the expected impact, etc, because that is their intended role, but in reality these will always be constructed in a way that makes it easier for the PI to solicit funding.

It's impossible, then, for the attention economy not to play into the research funding loop, because every set of eyeballs is another potential revenue source for future research, or a tool to justify growing the footprint of your lab. I wouldn't go so far as to say the superstructure corrupts science, though, not in those words. I'd say it forces science to be mission-focused, where the mission is a subtle negotiation between the people funding the research and the people performing it, and oftentimes the person with the capital lands much closer to their ideal.

derbOac · 3 years ago
Just speaking from personal experience, what you describe gets murky really fast in a couple of ways.

The model you're implicitly describing is this idealistic "executive-mentor" model of the PI, who has the ideas while the postdocs or doctoral students just implement them. Basically, the scientist wants to produce, so they need help and outsource the work to others.

In my experience, though, this is not at all what happens many times. Ideas come from those doctoral students or postdocs or whoever, the PI takes them, and then they get credit for those ideas. I've seen PIs who really don't fundamentally understand the research areas, who are kind of just "black holes" for the credit of the ideas and work of others around them, and then, because they're more senior, they end up getting the credit. Sometimes this process seems intentional, in that the PI cultivates a false impression of what's going on, and sometimes it just happens because of the nature of the attention economy.

So although the "executive-mentor" model is a good one, what's closer to reality in many cases (although not all) is more of a "public liaison-mascot" system, or some kind of hybrid.

Because of this mismatch between reality and the assumed schema, the attention economy then incentivizes abuse and corruption.

This isn't even getting into how chasing grants as a fundamental scientific endeavor distorts what is researched. Even if you have a pure leader-mentor PI who is just trying to get their own independent ideas researched with funding, you then have to ask: what is rewarded? Is it good, rigorous science, or what is popular?

The problem I think is that what garners attention is not what is rigorous or innovative. Sometimes those things overlap, and maybe they're correlated, but they're not the same.

Maybe this isn't unique to science, but that doesn't make it OK, and it seems like changing it to prevent these problems is necessary.

syncerr · 3 years ago
Attention is not the problem; it's the lack of accountability. Social platforms care about engagement, not quality of content (there's virtually no mechanism to incentivize content to meet any standard of quality beyond what can be measured in the moment).
Nextgrid · 3 years ago
Quality is subjective, but there’s no accountability about harmful or illegal content either, so platforms don’t only promote “general purpose” spam, but actively harmful content that intentionally seeds outrage or encourages violence as that generally leads to more engagement.
syncerr · 3 years ago
Absolutely, and it should include a range of indicators, like spam, scams, known falsehoods, and unsubstantiated claims.

o_1 · 3 years ago
Quality is subjective? Is it information or a product? Or is all information now a product?
sinenomine · 3 years ago
Why go off on such a long tangent when you could make a solid case about the legacy grant distribution system[1] corrupting science for decades? It is as close to funding and career success as it gets.

[1] https://newscience.org/nih/

Gatsky · 3 years ago
Sometimes I am left to wonder about the widespread criticism of science. Slow progress, broken publishing and career progression, terrible working conditions, prestige farming, poor mental health and exploitation of students, fake data, statistical war crimes, bullying, sexism, racism, elitism, harassment...

The question is, does any of this matter on a historical scale? Is Science doomed to fail? In 200 years, our descendants will probably look back at us with the same mix of condescension and slightly horrified fascination with which we view our 19th century counterparts. Our stupid scientific publishing system will be viewed similarly to the plumbing in London in the 1850s. The chimney sweep and the graduate student suffer similar plights. It is terrible, immoral, we should do better, but then this is always the case.

On one level, our future descendants should be grateful - we eradicated smallpox (that alone would be enough really, an unprecedented gift to all future humanity, a boundless alleviation of suffering), discovered antibiotics, greatly reduced child mortality, invented quantum mechanics and relativity, drastically increased our computational capacity, left the planet for the first time, and connected almost everyone in the world. All of this was done under conditions considerably less ideal than today's, which are themselves still far from desirable. Maybe that's all that matters?

Science being Science will sort itself out most likely. We should mainly try to reduce the human suffering involved, and allow greater diversity in how and where research is done.

n4r9 · 3 years ago
My main concern is that perhaps there is an "attention threshold" on fundamental breakthrough results in physics and mathematics, and that we're diminishing the chances of ever getting such results again because theorists don't have enough attention to spare. If you spend all day writing grant proposals and trivial papers (just for the sake of publishing), then go home and veg out on Netflix and Twitter, when are you supposed to have that quality "sitting and thinking" time?
karencarits · 3 years ago
That society is moving away from the concept of calling or vocation is also a problem. As a scientist (or physician or nurse or whatever) you are no longer rewarded for devoting your life to such things. You will not be supported (you may even be shamed for your enthusiasm, or for failing in the areas of life that you are sacrificing), no one will watch your back, the administration will blame you for whatever it can, and you will not be paid overtime, of course.
enviclash · 3 years ago
"Science being Science will sort itself out most likely" defies logic of current events.
Gatsky · 3 years ago
Most of the problems are with the Academies. They are of course deeply flawed institutions like all human institutions, and generally overrated. But Science isn't merely the Academies.
wanderingmind · 3 years ago
" Scientists list media exposure counts on résumés, and many PhD theses now include the number of times a candidate’s work has appeared in the popular science press." This is a mandatory requirement to a EB1A green card. Maybe the government can do something from its side to reduce the fluff.
adharmad · 3 years ago
The problem is, any system that replaces it, and has some measurable/countable metric, can effectively be gamed in the same manner.
Fomite · 3 years ago
"The attention a scientist’s work gains from the public now plays into its perceived value. Scientists list media exposure counts on résumés, and many PhD theses now include the number of times a candidate’s work has appeared in the popular science press. Science has succumbed to the attention economy."

Sitting on a tenure and promotion committee at an R1 university, I can tell you this type of stuff is just as likely to torpedo you as it is to boost you.

abrax3141 · 3 years ago
I agree. I’ve almost never seen this, and I read a lot of scientific vitae. (The one time I did see it it was because a major documentary by a major scientific org had been made from the results.) I think they mostly made this observation up. (There’s no citation.) Calls the whole thing into question, which sounded pretty simplistic to begin with.
Fomite · 3 years ago
There are only two places I call it out in my CV:

1) A pandemic-related paper, where press coverage was a big deal, because it was meant to influence policy decisions on a very short time frame.

2) One paper where we are in the 99th percentile for media and social media mentions for a very good journal, and we call that out specifically.

But as a general rule? You're far more likely to trigger "Is that what you're spending your time on?" and "Not serious about scholarship" than you are to find someone impressed with your altmetric score. Indeed, I'm worried about several of my colleagues who are leaning hard in that direction (being the Social Media Editor of a journal, etc.)

seydor · 3 years ago
The question is when? Surely, with the passage of time, it's more likely scams will be revealed. But if the professor has already retired, the damage and lost effort are irreversible.