dekhn · a year ago
These sorts of articles raise so many thoughts and emotions in me. I was trained as a computational biologist with a little lab work and ran gels from time to time. Personally, I hated gels- they're finicky, messy, ugly, and don't really tell you very much. But molecular biology as a field runs on gels- it's the primary source of results for almost everything in molbio. I have seen so many talks and papers that rested entirely on a single image of a gel, which is really just some dark bands.

At the same time, I was a failed scientist: my gels weren't as interesting or convincing as the ones done by the folks who went on to be more successful. At the time (20+ years ago) it didn't occur to me that anybody would intentionally modify images of gels to promote the results they claimed, although I did assume that folks didn't do a good job of organizing their data, and occasionally published papers that were wrong simply because they confused two images.

Would I have been more successful if fewer people (and I now believe this is a common occurrence) published fraudulent images of gels? Maybe, maybe not. But the more important thing is that everybody just went along with this. I participated in many journal clubs where folks would just flip to Figure 3, assume the gel was what the authors claimed, and proceed to agree with (or disagree with) the results and conclusions uncritically. Whereas I would spend a lot of time trying to understand what experiment was actually run, and what the data showed.

testfoobar · a year ago
Similar - when I was younger, I would never have suspected that a scientist was committing fraud.

As I've gotten older, I understand that Charlie Munger's observation "Show me the incentive and I will show you the outcome" is applicable everywhere - including science.

Academic scientists' careers are driven by publishing, citations and impact. Arguably some have figured out how to game the system to advance their careers. Science be damned.

klenwell · a year ago
I think my favorite Simpsons gag is the episode where Lisa enlists a scientist (voiced by Stephen Jay Gould) to run tests to debunk some angel bones that were found at a construction site.

In the middle of the episode, the scientist bicycles up to report, dramatically, that the tests "were inconclusive".

In the end, it's revealed that the bones were a fraud concocted by some mall developers to promote their new mall.

After this is revealed, Lisa asks the scientist about the tests. He shrugs:

"I'm not going to lie to you, Lisa. I never ran the tests."

It's funny on a few levels but what I find most amusing is that his incentive is left a mystery.

sitkack · a year ago
If humanity is to mature, we should be an open book when it comes to incentives and build a world purposefully with all incentives aligned to the outcomes we collectively agree upon.

https://fs.blog/great-talks/psychology-human-misjudgment/

Charlie Munger's Misjudgment #1: Incentive-caused Bias https://www.youtube.com/watch?v=h-2yIO8cnvw

https://fs.blog/bias-incentives-reinforcement/

almostgotcaught · a year ago
> Arguably some have figured out how to game the system to advance their careers

lol arguably? i would bet my generous, non-academia, industry salary for the next 10 years, that there's not a single academic with a citation count over say ... 50k (ostensibly the most successful academic) that isn't gaming the system.

- signed someone who got their PhD at a "prestigious" uni under a guy with >100k citations

huijzer · a year ago
> Academic scientists' careers are driven by publishing, citations and impact. Arguably some have figured out how to game the system to advance their careers. Science be damned.

I've talked to multiple professors about this and I think it's not because they don't care about science. They just care more about their career. And I don't blame them. It's a slippery slope, and once you notice other people starting to beat you, it's very hard to stay on the righteous path[note]. Heck, even I wrote things during my PhD that I don't agree with. But at some point you have to pick your battles. You cannot fight every point.

In the end I also don't think they care that much about science. Political parties often push certain ideas more or less depending on their beliefs. And scientists know this, so they will often frame their own ideas so that they sound like they solve a problem for the government. If you think about it, it's kind of a miracle that sometimes something good is produced from all this mess. There is some beauty to that.

[note] I’m not talking about blatant fraud here but about the smaller things like accepting comments from a reviewer which you know are incorrect, or using a methodology that is the status quo but you know is highly problematic.

infamouscow · a year ago
The Manhattan project was a government project that was run like a startup.

If such a project happened today, academic scientists would be trying to figure out ways to bend their existing research to match the grants. Then it would take another 30 years before people started to ask why nothing has been delivered yet.

nextos · a year ago
Lots of people doing research find this depressing to the point of quitting. Many of my peers left research as they couldn't stomach all this nonsense. In experimental fields, the current academic system rewards dishonesty so much that ugly things have become really common.

In my relatively short career, I have been asked to manipulate results several times. I refused, but this took an immense toll, especially on two occasions. Some people working with me wanted to support me fighting dishonesty. But guess what, they all had families and careers and were ultimately not willing to do anything as this could jeopardize their position.

I've also witnessed first-hand how people that manage to publish well adopt monopolistic strategies, sabotaging interesting grant proposals from other groups or stalling their article submissions while they copy them. This is a problem that seldom gets discussed. The current review system favors monocultures and winner-takes-all scenarios.

For these reasons, I think industrial labs will be doing much better. Incentives there are not that perverse.

_heimdall · a year ago
The fact that it can still be considered science when intentional fraud is involved is a huge problem itself.
Dalewyn · a year ago
>Charlie Munger's observation

The more I've read about finance, the more I've realized it can also be applied to many other things in the world due to its sheer objectivity.

On the other hand, I've also noticed most if not all of it is based on contexts and data from the mid 20th century. Interesting how that turns out.

SkyMarshal · a year ago
> Academic scientists' careers are driven by publishing, citations and impact.

Publishing and citations can be and are gamed, but is impact also gamed on a wide scale? That one seems harder to fake. Either a result is true and useful, or it's not.

lazyeye · a year ago
How much do these same incentives apply to climate science, where huge amounts of money are now in play?
neycoda · a year ago
That attitude coincides with the current delusion in our society that science is perpetrating a fraud on the level of religions whose leaders are trying to control their flock for financial and sexual gain.

A broken system that incentivizes fraud over knowledge is a real problem.

An assertion that scientists chase the money by nature is a dangerous one that will set us back to the Stone Age when instead we should be out traversing space.

antisthenes · a year ago
> Similar - when I was younger, I would never have suspected that a scientist was committing fraud.

Unfortunately many less bright people seem to interpret this as "never trust science", when in reality science is still the best way to push humanity forward and alleviate human suffering, _despite_ all the fraud and misaligned incentives that may influence it.

asdf123qweasd · a year ago
At some point, the good scientists leave and the fraudsters start to filter for more fraudsters. If that goes on, it's over - academia is gone. Entirely. It cannot grow back. It's just a building with con men in lab coats.

My suggestion stands: Give true scientists the ability to hunt fraudsters for budgets. If you hunt and nail down a fraudster, you get his funding for your research.

bluecheese452 · a year ago
All the fraudsters will nail the honest ones before they know what hit them.
Balgair · a year ago
I mean, the replication crisis has come and gone; it's been about 5 years now. The fraudsters are running the place and have been for at least the last half decade, full stop.
seanmcdirmid · a year ago
It becomes a survival bias: if people can cheat at a competitive game (or research field) and get away with it, then at the end you'll wind up with only cheaters left (everyone else stops playing).
inglor_cz · a year ago
You could improve the situation by incentivizing people to identify cheaters and prove their cheating. If being a successful cheater-hunter was a good career, the field would become self-policing.

This approach opens its own can of worms (you don't want to overdo it and create a paranoid police-state-like structure), but so far, we have way too little self-policing in science, and the first attempts (like Data Colada) are very controversial among their peers.

dennis_jeeves2 · a year ago
As they say: the scum rises to the top. True for academia, politics, etc., any organization really.

Quote: "The Only Thing Necessary for the Triumph of Evil is that Good Men Do Nothing"

My own nuanced take on it:

Incompetent people are quick to grab authority and power. On the other hand, principled, competent people are reluctant to take on positions of authority and power even when offered. For these people, positions of power a) carry connotations of tyranny and b) are uninteresting (i.e. technical problems are more interesting). The reluctance of principled people to form coalitions to keep out the cheaters also exacerbates the problem, because they are a divided bunch themselves, whereas the cheaters can often collude (temporarily) to achieve their nefarious goals.

yard2010 · a year ago
This is why cheaters should never ever ever be allowed to play again with fair players, only with cheaters
araes · a year ago
And thus we have the Earth. Where all looks like a broken MMO in every direction. Everybody refuses to participate, because it's 100% griefers, yet nobody can leave.

Business: Can you get a law written to command the economy to give you money or never suffer punishments? Intel fabs (https://reason.com/2024/03/20/federal-handout-to-intel-will-...), Tesla dealers (https://en.wikipedia.org/wiki/Tesla_US_dealership_disputes), Uber taxis (https://www.theguardian.com/news/2022/jul/10/uber-files-leak...), etc. Are you wealthy enough that there's nothing "normals" can really do? EBay intimidation scandal (https://en.wikipedia.org/wiki/EBay_stalking_scandal).

Economic Academia: Harvard Prof. Gino (https://www.thecrimson.com/article/2024/4/11/harvard-busines...)

Materials Academia: Doping + Graphene = feces papers (https://pubs.acs.org/doi/pdf/10.1021/acsnano.9b00184) "Will Any Crap We Put into Graphene Increase Its Electrocatalytic Effect?" (Bonus joke! Crap is actually a better dopant material.)

Gaming: Roblox double cut on sales (that people mostly just argue about how enormous it is, because the math's purposely confusing) (https://news.ycombinator.com/item?id=28247034)

Politics: Was Santos ever actually punished?

Military: The saga of the Navy, Pacific Fleet, and Fat Leonard (https://en.wikipedia.org/wiki/Fat_Leonard_scandal) "exploited the intelligence for illicit profit, brazenly ordering his moles to redirect aircraft carriers, ships and subs to ports he controlled in Southeast Asia so he could more easily bilk the Navy for fuel, tugboats, barges, food, water and sewage removal."

Work: "Loyal workers are selectively and ironically targeted for exploitation" (https://www.sciencedirect.com/science/article/abs/pii/S00221...)

There are others; that's just already so many...

lupire · a year ago
Anything not Forbidden is Compulsory.
itronitron · a year ago
I used to work with someone up until the point I realized they were so distant from any form of reality that they couldn't distinguish between fact and fiction.

Naturally, they are now the head of AI where they work.

wredue · a year ago
Hacker news is completely flooded with “AI learns just like humans do” and “AI models the human brain” despite neither of these things having any concrete evidence at all.

Unfortunately it isn’t just bosses being fooled by this. Scores of people push this crap.

I am not saying AI has no value. I am saying that these idiots are idiots.

RunSet · a year ago
Calls to mind Isaac Asimov's "shotgun curve".

https://archive.org/details/Fantasy_Science_Fiction_v056n06_...

rcxdude · a year ago
That story reminds me of this gem: https://pages.cs.wisc.edu/~kovar/hall.html
schmidtleonard · a year ago
Similar story: computational biologist, my presentations involved statistics so people would come to me for help, and it often ended in the disappointing news of a null result. I noticed that it always got published anyway at whichever stage of analysis showed "promise." The day I saw someone P-hack their way to the front page of Nature was the day I decided to quit biology.
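
To make concrete what p-hacking looks like in practice, here is a minimal, entirely hypothetical sketch (assuming NumPy and SciPy are available; all "data" below is simulated noise and the subgroup framing is invented for illustration): slice a null dataset enough ways and some slice will cross p < 0.05.

    # Hypothetical sketch of p-hacking: every measurement is pure noise,
    # yet testing enough arbitrary subgroups yields a "significant" p-value.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n_subgroups = 40                  # e.g. 40 ways to slice the same cohort

    control = rng.normal(0.0, 1.0, size=(n_subgroups, 30))
    treated = rng.normal(0.0, 1.0, size=(n_subgroups, 30))  # no real effect

    for i in range(n_subgroups):
        p = stats.ttest_ind(control[i], treated[i]).pvalue
        if p < 0.05:
            print(f"subgroup {i}: p = {p:.3f}  <- report only this one")

With 40 independent slices of pure noise, the chance that at least one comes out "significant" at 0.05 is roughly 87%; the multiple-comparisons correction (e.g. Bonferroni, requiring p < 0.05/40) is exactly the step that gets skipped.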

I still feel that my bio work was far more important than anything I've done since, but over here the work is easier, the wages are much better, and fraud isn't table stakes. Frankly in exchange for those things I'm OK with the work being less important (EDIT: that's not a swipe at software engineering or my niche in it, it's a swipe at a system that is bad at incentives).

Oh, and it turns out that software orgs have exactly the same problem, but they know that the solution is to pay for verification work. Science has to move through a few more stages of grief before it accepts this.

j-wags · a year ago
I'm mostly out now, but I would love to return to a more accountable academia. Often in these discussions it's hard to say "we need radical changes to publicly funded research and many PIs should be held accountable for dishonest work" without people hearing "I want to get rid of publicly funded research altogether and destroy the careers of a generation of trainees who were in the wrong place at the wrong time".

Even in my immediate circles, I know many industry scientists who do scientific work beyond the level required by their company, fight to publish it in journals, mentor junior colleagues in a very similar manner to a PhD advisor, and would in every way make excellent professors. There would be a stampede if these people were offered a return to a more accountable academia. Even with lower pay, longer hours, and department duties, MORE than enough highly qualified people would rush in.

A hypothetical transition to this world should be tapered. But even at the limit where academia switched overnight, trainees caught in such a transition could be guaranteed their spots in their program, given direct fellowships to make them independent of their advisor's grants, given the option to switch advisor, and have their graduation requirements relaxed if appropriate.

It's easy to hem and haw about the institutional knowledge and ongoing projects that would invariably be lost in such a transition, even if very carefully executed. But we have to consider the ongoing damage being done when, for example, Biogen spends thousands of scientist-years and billions of dollars failing to make an Alzheimer's drug because the work was dishonest to begin with, or when generations of trainees learn that bending the truth is a little more OK each year.

nordsieck · a year ago
What's amazing to me is that journals don't require researchers to submit their raw data. At least, as far as I know.

The only option for someone who wants to double check research is to completely replicate a study, which is quite a bit more expensive than double checking the researcher's work.

frickinLasers · a year ago
Journals are incentivized to publish fantastic results. Organizing raw data in a way that the uninitiated can understand presents serious friction in getting results out the door.

The organizations who fund the research are (finally) beginning to require it [0][1], and some journals encourage it, but a massive cultural shift is required and there will be growing pains.

You could also try emailing the corresponding authors. Any good-faith scientist should be happy to share what they have, assuming it's well organized/legible.

[0] https://new.nsf.gov/public-access [1] https://sharing.nih.gov/

gvurrdon · a year ago
It's becoming more common for journals to have policies which require that raw data be made available. Here's some background: https://en.wikipedia.org/wiki/FAIR_data

One of the purposes of a site on which I work (https://fairsharing.org) is to assist researchers in finding places where they might upload their data (usually to comply with publishers' requirements).

insane_dreamer · a year ago
Replicating the results from someone's original data is difficult and time consuming, and other researchers aren't getting paid to do that (they're getting paid to do new research). And of course the (unpaid) reviewers don't have time either.
mfld · a year ago
Re: the role of (gel) images as the key aspect of a publication. To me this is very understandable, as they convey the information in the most succinct way and also constitute the main data & evidence. Faking this is so bold that it seemed unlikely.

The good news IMO: more recent MolBio methods produce data that can be checked more rigorously than a gel image. A recent example where the evidence in form of DNA sequencing data is contested: https://doi.org/10.1128/mbio.01607-23

busyant · a year ago
> don't really tell you very much

???

I think this statement is either meaningless or incorrect. At the very least your conclusion is context dependent.

That being said, I ran gels back in the stone ages when you didn't just buy a stack of pre-made gels that slotted into a tank.

I had to clean my glass plates, make the polyacrylamide solution, clamp the plates together with office binder clips and make sure that the rubber gasket was water tight. So many times, the gasket seal was poor and my polyacrylamide leaked all over the bench top.

I hated running them. But when they worked, they were remarkably informative.

bafe · a year ago
Count me in the club of failed scientists. In my case it was the geosciences: I would spend hours trying to make all my analyses reproducible and statistically sound while many colleagues just published preliminary simulation results, getting much more attention and even academic jobs. On the flip side, the time spent improving my data processing workflows led to good engineering jobs, so the time wasn't entirely wasted.
doctorpangloss · a year ago
> raise so many... emotions in me... and I now believe [faking gels] is a common occurrence

On the other hand, shysters always project, and this thread is full of cringe vindications about cheating or faking or whatever. As your "emotions" are probably telling you, that kind of generalization does not feel good when it is pointed at you, so IMO you can go and bash your colleagues all you want, but odds are the ones who found results did so legitimately.

smolder · a year ago
Regarding "shysters always project": it rings true to me, but given the topic, I'm primed to wonder how you could show that empirically, and if there's any psychology literature to that effect.
pbreit · a year ago
As long as it's all peer reviewed!!
DigitalPaladin · a year ago
I'm assuming /s above.

Because the amount of pencil-whipped "peer review" feedback I've received could fit in a shoe box, because many "reviewers" are looking for the CV credit for their role and not so much the actual effort of reviewing.

And there's no way to call them out on their laziness except maybe to not submit to their publication again and warn others against it too.

And, to defend their lack of review, all they need to say to the editor anyway is: "I didn't see it that way."

enugu · a year ago
Many solutions involving posting data in repositories or audits are being discussed in the comments.

But given that many people are saying that they noticed and quit academia, how about also creating a more direct 'whistleblower' type of system, where complaints (with detailed descriptions of the fraud, or a general view of what one sees in terms of loose practices) go to some research monitoring team which can then come in and verify the problems.

cwalv · a year ago
> how about also creating a more direct 'whistleblower' type of system

There needs to first be a system of checks and balances for this to work. The people at the top already know and condone the behavior; who are the whistleblowers reporting to?

"We represent the top scientists in our field; these are a group of grad students. Who are you going to believe?"

And of course they can easily shut anyone down with two words: "science denier"

kjkjadksj · a year ago
Gels tell you quite a lot; it's the question you are asking that matters more to whether the results are useful than the technique itself. Of course people lie and cheat in science. Wet lab and dry lab. So many dry lab papers, for example, are out there where code is supposedly available "by request" and we take the figures on faith.
TheMagicHorsey · a year ago
This is why institutions break down in the long run in any civilization. People like you, people of principle, are drowned out by agents acting exclusively in their own interest, without ethics.

It happens everywhere.

The only solution to this is skin in the game. Without skin in the game the fraudsters fraud, the audience just naively goes along with it, and the institution collapses under the weight of lies.

Balgair · a year ago
The iron laws of bureaucracy are:

1) Nothing else matters but the budget

2) Over the long run, the people invested in the bureaucracy always win out over the people invested in the mission/point

Science is just as susceptible to 2) as anything else.

worik · a year ago
> The only solution to this is skin in the game.

Another solution is the opposite, no skin in the game

Remove concrete incentives and pay salaries

BobbyTables2 · a year ago
I feel this way about every flashy startup with billion dollar valuations.

It seems amazing that they are pulling off what seems impossible.

Years later, we learn they really aren’t. They unjustifiably made a name for themselves by burning VC money instead of running a successful business.

throwaway14356 · a year ago
Then hiring the person with the uninteresting gels seems preferable.
nonrandomstring · a year ago
> Would I have been more successful

What are you talking about? You _are_ successful. You're not a fraud like all those other tossers.

dekhn · a year ago
To me, at the time, successful would have been getting a tenure-track position at a Tier 1 university, discovering something important, and never publishing anything that was intentional fraud (I'm OK with making some level of legitimate errors that could need to be retracted).

Of those three, I certainly didn't achieve #1 or #2, but did achieve #3, mainly because I didn't write very much and obsessed over what was sent to the editor. Merely being a non-fraud is only part of success.

(note: I've changed my definition of success; I now realize that I never ever really wanted to be a tenured professor at a Tier 1 university, because that role is far less fulfilling than I thought it would be).

SpaceManNabs · a year ago
That is not enough to most people. And if it is enough for others, then it is probably because they were fortunate enough to fall back on something better.
otikik · a year ago
Indeed! You would also have been more "successful" selling drugs to teens or trafficking in human organs. But you did not, and that's a good thing.
dennis_jeeves2 · a year ago
> At the time (20+ years ago) it didn't occur to me that anybody would intentionally modify images of gels to promote the results they claimed

Fraud, I suspect, is only the tip of the iceberg; worse still is the delusion that what is taught is factually correct. A large portion of mainstream knowledge that we call 'science' is incorrect.

While fraudulent claims are relatively easy to detect, claims that are backed up by ignorance/delusion are harder to detect and challenge because often there is collective ignorance.

Quote 1: "Never ascribe to malice that which is adequately explained by incompetence"

Quote 2:"Science is the belief in the ignorance of experts"

Side note: I will not offer to back up my above statements, since these are things that an individual has to learn on their own, through healthy skepticism, intellectual integrity and inquiry.

owenpalmer · a year ago
> A large portion of mainstream knowledge that we call 'science' is incorrect.

How do you know that? Can you prove it scientifically?

> claims that are backed up by ignorance/delusion

In that case, they are not "backed up"

> I will not offer to back up my above statements

> an individual has to learn it on their own, through ... inquiry

May I "inquire" about your reasoning?

wredue · a year ago
Science sent us to the moon. “Do your own research” sent millions to their graves.

“Do your own research” is a movement that is fraught with grifting and basically foundationally just fraud to the core.

“Science” definitely has some fraudsters, but remains the best institution we have in the search for truth.

Lionga · a year ago
Don't hate the player, hate the game. Governments have made it so scientists only survive if they show results, and specifically the results they want to see. Otherwise, no more grants and you are done. Whether the results are fake or true does not matter.

"Science" nowadays is mostly BS, while the scientific method (hardly ever used in "science" nowadays) is still gold.

MetaWhirledPeas · a year ago
Do hate the player. People are taught ethics for a reason: no set of rules and laws is sufficient to ensure the integrity of the system. We rely on personal integrity. This is why we teach it to our children.
greenavocado · a year ago
A true scientist never says, "trust me" or even worse, "trust the science."

https://www.youtube.com/watch?v=gnPFL0Dr34c

ljosifov · a year ago
You have agency. Yes - the system provides incentives. However, you are not some pass-through nothingness that just accepts any incentives. You can choose not to accept the incentives. You can leave the system. You're lucky - it's not a totalitarian system. There will be another area of life and work where the incentives align with your personal morals.

Once you bend your spine and kneel to bad incentives - you can never walk completely upright again. You may think and convince yourself that you can stay in the system with bad incentives, play the game, but still somehow you the player remain platonically unaffected. This is a delusion, and at some level you know it too.

Who knows? If everyone left the system with bad incentives, it may be that the bad system even collapses. It's a problem of collective action. The chances are against a collapse; it will likely continue to go on for some time. So don't count on a collapse. And even if one were to happen in your time, it would be scorched earth post-collapse for some time. Think as an individual - it's best to leave if you possibly can.

vanderZwan · a year ago
> Don't hate the player hate the game.

When the game is designed by the most successful players you absolutely should hate the players for creating a shitty game.

Spivak · a year ago
You are clearly deeply disconnected from the actual practice of research.

The best you can really say is that the statistics chops of most researchers are lacking, and that someone researching, say, caterpillars is likely to not really understand the maths behind the tests they're performing. It's not an ideal solution by any means, but universities are starting to hire stats and CS department grads to handle that part.

BobaFloutist · a year ago
"Nobody is ever responsible for their own actions. Economics predicting the existence of bad actors makes them not actually bad."
neom · a year ago
I'm the furthest thing from a scientist unless you count 3,000 hours of PBS Spacetime, but I love science, and so science/academia fraud, to me, feels kinda like the worst kind of fraud you can commit. Financial fraud can cause suicides and ruin lives, sure, but I feel like academic fraud just sets the whole of humanity back? I also feel that through my life I've (maybe wrongly) placed a great deal of respect and trust in scientists, mostly that they understand that their work is of the utmost importance and so the downstream consequences of mucking around are just too grave. Stuff like this seems to bother me more than it rationally should. Are people who commit this type of science fraud just really evil humans? Am I overthinking this? Do scientists go to jail for academic fraud?
vasco · a year ago
Pick up an old engineering book at some point, something from the mid-1800s or early 1900s, and you'll quickly realize that the trust people put in science isn't what it should be. The scientific method works over a long period of time, but to blindly trust a peer-reviewed study that just came out, any study, is almost as much faith as religion, especially if you're not a high-level researcher in the same field and haven't spent a good amount of time reading their methodology yourself. If you go to the social sciences, the amount of crock that gets published is incredible.

As a quick example, any book about electricity from the early 1900s will include quite serious sections about the positive effects of electromagnetic radiation (or "EM field therapies"), teach you about different frequencies and modulations for different illnesses, and explain how doctors are applying them. Today these devices are peddled by scammers of the same ilk as the ones that align your chakras with the right stone on your forehead.

momoschili · a year ago
Going to need some citations here since the texts that I'm familiar with from that time period are "A Treatise on Electricity and Magnetism" by Maxwell (mid-late 1800s) and "A History of the Theories of Aether and Electricity" by E. T. Whittaker, neither of which mentions anything of the sort. I suspect you are choosing from texts that at the time likely would not have been considered academic or standard.
eightysixfour · a year ago
Science is eventually consistent. Scientists, individual papers, and the dogma at any given time may not be. It just takes a long time.
NemoNobody · a year ago
It is absolutely as much faith as religion. Just the basic assumption that the universe didn't come from anything is as baseless as the assumption that it did.
hyperG · a year ago
The default state of the human brain almost seems to be a form of anti-science, blind faith in what you already believe, especially if you stand to gain personally from what you believe being true.

What is most incredible to me is even knowing and believing the above, I fall prey to this all the time.

luckydata · a year ago
the best example is psychology. the entire field needs to be scrapped and started over, nothing you read on any of those papers can be trusted, it's just heaping piles of bad research dressed with a thin veil of statistical respectability.
dekhn · a year ago
We use EM radiation for illnesses and doctors apply them. It's one of the most important diagnostic and treatment options we have. I think what you're referring to is invalid therapies ("woo" or snake oil or just plain ignorance/greed) but it's hard to distinguish those from legitimate therapies at times.
mmooss · a year ago
Why do you need to go back 100-170 years, if it's an issue? Aren't there more recent examples?
jeffybefffy519 · a year ago
This proves South Park was right: science is just another form of religion.

jimbokun · a year ago
I think the error is putting trust in scientists as people, instead of putting trust in science as a methodology. The methodology is designed to rely on trusting a process, not trusting individuals, to arrive at the truth.

I guess it also reinforces the supreme importance of reproducibility. Seems like no research result should be taken seriously until at least one other scientist or group of scientists are able to reproduce the result.

And if the work isn't sufficiently defined to the point of being reproducible, it should be considered a garbage study.

drpossum · a year ago
There is no way to do any kind of science without putting trust in people. Science is not the universe as it is presented. Science is the human interpretation of observation. People are who carry out and interpret experiments. There is no set of methodology you can adopt that will ever change that. "Reproducibility" is important, but it is not a silver bullet. You cannot run any experiment exactly in the same way ever.

If you have independent measurements you cannot rule out bias from prior results. Look at the error bars here on published values of the electron charge and tell me that methodology or reproducibility shored up the result. https://hsm.stackexchange.com/questions/264/timeline-of-meas...

doitLP · a year ago
The way I sum it up is: science is a method, which is not equivalent to the institution of science, and because that institution is run by humans it will contain and perpetrate all the ills of any human group.
wordsinaline · a year ago
This error really went viral during the pandemic and continues to this day. We're in for an Orwellian future if the public does not cultivate some skeptic impulse.
Der_Einzige · a year ago
Science is an anarchic enterprise. There is no "one scientific method", and anyone telling you there is has something to sell to you (likely academic careerism). https://en.wikipedia.org/wiki/Against_Method
inglor_cz · a year ago
I think it is fine to put some trust into concrete individual scientists who have proven themselves reliable.

It is not fine to put trust into scientists in general just because they walk around in a lab coat with a PhD label on its front.

yunwal · a year ago
How does this work for things like COVID vaccines, where waiting for a reproduction study would leave hundreds of thousands dead? Ultimately there needs to be some level of trust in scientific institutions as well. I do think placing higher value on reproducibility studies might help the issue somewhat, but I think there also needs to be a larger culture shift of accountability and a higher purpose than profit.
LeifCarrotson · a year ago
You're far from a scientist, so it's easy for you to put scientists/academia on a pedestal.

For most of the people who end up in these scandals, this is just the day job that their various choices and random chance led up to. They're just ordinary humans responding to ordinary incentives in light of whatever consequences and risks they may or may not have considered.

Other careers, like teaching, medicine, and engineering have similar problems.

smolder · a year ago
And management consulting. But probably not plumbing, where the work product is very concrete and easy to judge.
jessriedel · a year ago
As a scientist, I agree, although for not quite the reason you gave. Scientists are given tremendous freedom and resources by society (public dollars, but also private dollars like at my industry research lab). I think scientists have a corresponding higher duty for honesty.

Jobs at top institutions are worth much more than their nominal salary, as evidenced by how much those people could be making in the private sector. (They are compensated mostly in freedom and intellectual stimulation.) Unambiguously faking data, which is the sort of thing a bad actor might do to get a top job, should be considered at least as bad a moral transgression as stealing hundreds of thousands or perhaps a few million dollars.

(What is the downside? I have never once heard a researcher express feeling threatened or wary of being falsely/unjustly accused of fraud.)

kzz102 · a year ago
In my view, prosecuting the bad actors alone will not fix science. Science is by its own nature a community, because only a small number of people have the expertise (and university positions) to participate. A healthy scientific discipline and a healthy community are the same thing. Just like the "tough on crime" initiative alone often does not help a problematic community, just punishing scientific fraud harshly will not fix the problem. Because the community is small, to catch the bad actors you will either have insiders policing themselves or have non-expert outsiders rendering judgements. It's easy for well-intentioned policing efforts to turn into power struggles.

This is why I think the most effective way is to empower good actors. Ensure open debate, limit the power of individuals, and prevent over concentration of power in a small group. These efforts are harder to implement than you think because they run against our desire to have scientific superstars and celebrities, but I think they will go a long way towards building a healthy community.

pclmulqdq · a year ago
"Broken windows" policing has been shown to work, which I think is what you mean by "tough on crime."
madmask · a year ago
I agree with you, science fraud is terrible. It pollutes and breaks the scientific method. Enormous resources are wasted, not just by the fraudster but also by all the other well meaning scientists who base their work on that.

In my experience no, most fraudsters are not evil people, they just follow the incentives and almost non-existent disincentives. Being a scientist has become just a job, and you find all kinds of people there.

As far as I know, no one goes to jail; the worst thing possible (and very rare) is losing the job, most likely just the reputation.

mden · a year ago
"most fraudsters are not evil people, they just follow the incentives and almost non-existent disincentives"

Maybe I'm too idealistic, but why is following incentives with no regard for secondary consequences not evil?

dfedbeef · a year ago
That is what evil usually looks like. You expected horns and fire?
photochemsyn · a year ago
It's complicated. Historically scientific fraud could be construed as 'good-intentioned' - typically a researcher in a cutting edge field might think they understood how a system worked, and wanting to be first to publish for reasons of career advancement, would cook up data so they could get their paper into print before anyone else.

Indeed, I believe many academic careers were kicked off in this manner. Where it all goes wrong is when other more diligent researchers fail to reproduce said fraudulent research - this is what brought down famous fraudster Jan Hendrik Schön in the field of plastic-based organic electronics, which involved something like 9 papers in Science and Nature. There are good books and documentaries on that one. This will only be getting worse with AI data generation, as most of those frauds were detected by banal data replication, obvious cuts and pastes, etc.

However, when you add a big financial driver, things really go off the rails. A new pharmaceutical brings investors sniffing for a big payout, and cooking data to make the patentable 'discovery' look better than it really is creates a strong incentive to commit egregious fraud. Bug-eyed greed makes people do foolish things.

pclmulqdq · a year ago
People like us think scientists care about big-money things, but they largely don't care about that stuff as much as they care about prestige in their field. Prominent scientists get huge rewards of power and influence, as well as indirect money from leveraging that influence. When you start to think that way, the incentives for fraud become very "minor" and "petty" compared to what you are thinking of.
andrewflnr · a year ago
> Stuff like this seems to bother me more than it rationally should.

It's bothering you a rational amount, actually. These people have done serious damage to lots of lives and humanity in general. Society as a whole has at least as much interest in punishing them as it does for financial fraudsters. They should burn.

tppiotrowski · a year ago
There was a period of time when science was advanced by the aristocrats who were self funded and self motivated.

Once it became a distinguished profession the incentives changed.

"When a measure becomes a target, it ceases to be a good measure"

biorach · a year ago
> There was a period of time when science was advanced by the aristocrats who were self funded and self motivated.

From a distance the practice of science in early modern and Enlightenment times might look like the disinterested pursuit of knowledge for its own sake. If you read the detailed history of the times you'll see that the reality was much more messy.

hilux · a year ago
Goodhart's Law!
Balgair · a year ago
Generally, the fields that have a Nobel in them attract the glory hounds and therefore the fraudsters. The ones that don't, like geology or archeology for example, don't get the glory hounds.

Anytime you see champagne bottles up on a professor's top shelf with little tags for Nature publications (or something like that), then you know they are a glory hound.

When you see beer bottles in the trash, then you know they're in it for more than themselves.

Electricniko · a year ago
It seems like this could ultimately fall under the category of financial fraud, since the allegations are that he may have favorably misrepresented the results of drug trials where he was credited as an inventor of the drug that's now worth hundreds of millions of dollars.
mistercheph · a year ago
Evil is a much simpler explanation than recognizing that if you were in the same position with the same incentives, you would do the same thing. It's not just one event, it's a whole career of normalizing deviation from your values. Maybe you think you'd have morals that would have stopped you, maybe those same morals would have ensured you were never in a position to PI research like that.
mhh__ · a year ago
Scientific fraud can also compound really badly because people will try to replicate it, and the easiest results to fake are usually the most expensive...
edem · a year ago
I also watched almost all episodes of PBS Spacetime. Some of them multiple times. I'm so happy that Spacetime exists and also that Matt was recruited as a host (in place of Gabe). Highly recommended channel, superb content!
dghlsakjg · a year ago
It is the same flavor of fraud as financial fraud. It is about personal gain, and avoiding loss.

This kind of fraud happens because scientists are rewarded greatly for coming up with new, publishable, interesting results. They are punished severely for failing to do that.

You could be the department's best professor in terms of teaching, but if you aren't publishing, your job is at risk at many universities.

Scientists in Academia are incentivized to publish papers. If they can take shortcuts, and get away with it, they will. That's the whole problem, that's human nature.

This is why you don't see nearly as many industry scientists coming out with fraudulent papers. If Shell's scientists publish a paper, they aren't rewarded for that; if they come up with some efficient new way to refine oil they are rewarded, and they also might publish a paper if they feel like it.

janice1999 · a year ago
> If Shell's scientists publish a paper

A lot of companies reward employees for publications. Mine certainly does. Also an oil company may not be such a great example since they directly and covertly rewarded scientists for publishing papers undermining climate change research.

wredue · a year ago
Can you go to jail for knowingly defrauding another entity out of money (such as grants)? Yes. Absolutely.

Are you going to go to jail for fudging some numbers on your paper? Not likely.

pclmulqdq · a year ago
IMO faking a paper and then sticking that in your CV on a grant application is fraud enough to be worthy of jail.
transcranial · a year ago
As a collective endeavor to seek out higher truth, maybe some amount of fraud is necessary to train the immune system of the collective body, so to speak, so that it's more resilient in the long-term. But too much fraud, I agree, could tip into mistrust of the entire system. My fear is that AI further exacerbates this problem, and only AI itself can handle wading through the resulting volume of junk science output.
regus · a year ago
This is pretty funny. I usually hear this kind of language when a religious person is so devastated when their priest or pastor does something wrong that it causes them to leave their religion altogether. Are you going to do the same thing for scientism?
neom · a year ago
I'm not a particularly religious person; I didn't realize what you described is something that happens with any great frequency. Nevertheless, I suppose one is able to leave a particular place of worship and not leave a religion. As with any way people form their views on something societal like this, it's on a spectrum? Religion, politics, science, sex, education, whatever.
xnx · a year ago
Science is the search for truth. Lying is anathema to that.
brightball · a year ago
How this happens given the near reverence provided to “peer review” is another question.
eig · a year ago
This sort of behavior is only going to worsen in the coming decades as academics become more desperate. It's a prisoner's dilemma: if everyone is exaggerating their results you have to as well or you will be fired. It's even more dire for the thousands of visa students.

The situation is similar to the "Market for lemons" in cars: if the market is polluted with lemons (fake papers), you are disincentivized to publish a plum (real results), since no one can tell it's not faked. You are instead incentivized to take a plum straight to industry and not disseminate it at all. Pharma companies are already known to closely guard their most promising data/results.

Similar to the lemon market in cars, I think the only solution is government regulation. In fact, it would be a lot easier than passing lemon laws since most labs already get their funding from the government! Prior retractions should have significant negative impact on grant scores. This would not only incentivize labs, but would also incentivize institutions to hire clean scientists since they have higher grant earning potential.

jimbokun · a year ago
My recommendation is for journals to place at least as much importance on publishing replications as on the original studies.

Studies that have not been replicated should be published clearly marked as preliminary results. And then other scientists can pick those up and try to replicate them.

And institutions need to give near equal weight to replications as to original research when deciding on promotions. Should be considered every researchers responsibility to contribute to the overall field.

mlsu · a year ago
We can solve this at the grant level. Stipulate that for every new paper a group publishes from a grant, that group must also publish a replication of an existing finding. Publication would happen in pairs, so that every novel thing would be matched with a replication.

Replications could be matched with grants: if you receive a $100,000 grant, you'd get the $100,000 you need, plus another $100,000 which you could use to publish a replication of work from a previous $100,000 grant. Researchers can choose which findings they replicate, but with restrictions, e.g. you can't just choose your group's previous thing.

I think if we did this, researchers would naturally be incentivized to publish experiments that are easier to replicate and of course fraud like this would be caught eventually.

I bet we could throw away half of publications tomorrow and see no effect on the actual pace of progress in science.

calebh · a year ago
This stuff happens in Computer Science too. Back around 2018 or so I was working on a problem that required graph matching (a relaxed/fuzzy version of the graph isomorphism problem) and was trying algorithms from many different papers.

Many of the algorithms I tried to implement didn't work at all, despite considerable effort to get them to behave. In one particularly egregious (and highly cited) example, the algorithm in the paper differed from the provided code on GitHub. I emailed the authors trying to figure out what was going wrong, and they tried to get funding from me for support.

My manager wanted me to write a literature review paper which skewered all of these bad papers, but I refused since I thought it would hurt my career. Ironically, the algorithm that ended up working the best was from one of the more unknown papers, with few citations.
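
For context on the underlying problem (separate from any of the papers in question), the following is a minimal sketch of what "fuzzy" graph matching means, using graph edit distance in networkx on toy graphs; it is only a baseline illustration under those assumptions, not any of the published algorithms discussed above.

    # Toy sketch: graph edit distance as a crude "fuzzy" similarity between
    # graphs (smaller = more alike). Exact GED is exponential in general,
    # which is fine only for tiny graphs like these.
    import networkx as nx

    g1 = nx.cycle_graph(5)            # a 5-cycle
    g2 = nx.cycle_graph(5)
    g2.add_edge(0, 2)                 # same cycle plus one chord

    g3 = nx.star_graph(4)             # a structurally different 5-node graph

    print("cycle vs cycle+chord:", nx.graph_edit_distance(g1, g2))  # 1.0
    print("cycle vs star:       ", nx.graph_edit_distance(g1, g3))  # larger

The relaxed matching algorithms in the literature are, roughly speaking, trying to approximate this kind of similarity (or a node-to-node correspondence) at a scale where exact computation is hopeless, which is part of why such claims are hard to verify without working code.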

scarmig · a year ago
You should be able to build an entire career out of replications: hired at the best universities, published in the top journals, social prestige and respect. To the point where every novel study is replicated and published at least once. Until we get to that point, there will be far fewer replications than needed for a healthy scientific system.
snowwrestler · a year ago
Replications are not very scientifically useful. If there were flaws in the design of the original experiment, replicating the experiment will also replicate the flaws.

What we should aim for is confirmation: a different experiment that tests the underlying phenomenon that was the subject of the first paper.

breuleux · a year ago
I'd be careful about that. Faking replications is even easier than faking research, so if you place a lot of importance on them, expect the rate of fraud in replication studies to explode.

This is a very difficult problem to solve.

eig · a year ago
The problem with putting the onus on the journals is that there is no incentive for them to reward replications. Journals don't make money on replicated results. Customers don't buy the replication paper; they just read the abstract to see if it worked or not.

I do like the idea of institutions giving tenure to people with results that have stood the test of time, but again, there is no incentive to do so. Institutions want superstar faculty; they care less about whether the results are true.

The only real incentive that I think can be targeted is still grant money, but I would love to be proved wrong.

insane_dreamer · a year ago
> And then other scientists can pick those up and try to replicate them.

unless there are grants specifically for that purpose, then it's not going to happen; and it's hard to apply for a grant just to replicate someone else's results verbatim. (usually you're testing the theory but with a different experiment and set of data which is much more interesting than simply repeating what they did with their data; in fact replicating it with a different set of data is important in order to see if the results weren't cherry-picked to fit the original dataset).

dyauspitr · a year ago
I think it’s a great idea. It would also give the army of phds an endless stream of real tangible work and a way to quickly make a name for themselves by disproving results.
waveBidder · a year ago
journals have zero incentives to care about any of this.
creer · a year ago
It seems surprisingly hard to counter scientific fraud via a system change. The incentives are messed up all the way around.

If the older author is your advisor and you feel one of their juniors is cutting corners or the elder is cutting corners, you better think twice about what move will help your career. If confirming a recent result counts toward tenure, then presto you have an incentive for fraudulent replication (what's the chance it's incorrect anyway? The original author is a big shot.) Going against the previous acclaimed result takes guts especially in a small field where it might kill your career if YOU got it wrong somehow - So you need to have much stronger results than the original research, and good luck with that. We might say "this is perfect work for aspiring student researchers, and done all the time" - to reimplement some legendary science experiment - but no, not when it's a leading edge poorly understood experiment, and not when that same grad student is already running to try and produce original research themselves.

The big funders might dedicate money to replicated research that everybody is enthusiastic about (before everyone relies on it). But some research takes years to run. Other research is at the edge of what's possible. Other research is led by a big shot nobody dares to take on. Etc etc. So where is the incentive then? The incentive might be to take the money, fully intending to return an inconclusive result.

Some research is taken on now. But only AFTER it's relied on by lots of people. Or much later, when better ideas have had time to emerge on how to test the idea more cleverly, i.e. cheaper and faster. And that's not great, because it's costly in all the wasted effort by others based on a fraudulent result, and in all the mindshare the bad result now has.

This is messed up.

houston_Euler · a year ago
While Akerlof's Market for Lemons did consider cases where government intervention is necessary to preserve a market, like with health insurance markets (Medicare), he describes the "market for lemons" in the used car market as having been solved by warranties.

If someone brings a plum to a market for lemons, they can distinguish the quality of their product by offering a warranty on its purchase, something that sellers of lemons would be unwilling to do, because they want to pass the cost burden of the lemon onto the purchaser.

The full paper is fairly accessible, and worth a read.

Not sure how this could be applied to academia; one of the problems is that there can be a significant gap between perpetrating fraud and having it discovered, so the violators might still have an incentive to cheat.

pc86 · a year ago
> if everyone is exaggerating their results you have to as well or you will be fired.

Is this really the case, though? Isn't the whole point of tenure (or a big selling point at least) insulating academics from capricious firings?

The big question I have is that there are names on these fraudulent papers, so why are these people still employed? If you generate fictitious data to get published, you should lose any research or teaching job you have, and have to work at McDonald's or a warehouse for the rest of your life. There are plenty of people who want to be professors that we can eliminate the ones who will lie while doing it without losing much (perhaps anything). If your job was funded by taxpayer funds there should be criminal charges associated with willfully and knowingly fabricating data, results, or methods. At that point you're literally lying in order to steal taxpayer funds, it's no different than a city manager embezzling or grabbing a stack of $20 bills out of the cash register.

robotelvis · a year ago
Well you aren’t going to get tenure unless you distort your results and it’s hard to change established habits.

That, and you select for the kind of people who are willing to fake results to further their own careers.

fluidcruft · a year ago
I wonder if there are any studies on whether fraud increased after the Bayh-Dole Act. There's certainly fraud for prestige, that's pretty expected. But mixing in financial benefits increases the reward and brings administrators into play.
hilux · a year ago
> ... as academics become more desperate.

Yes and ... we're already there.

robwwilliams · a year ago
The incentive structures in science have been relatively stable since I entered the field in 1980 (neuroscience, developmental biology, genetics). The quality and quantity of science are extraordinary, but peer review is worse than bad. There are almost no incentives to review the work of your colleagues properly. It does not pay the bills, and you can make enemies easily.

But there was no golden era of science to look back on. It has always been a wonderful productive mess—much like the rest of life. At least it moves forward—and now exceedingly rapidly.

Almost unbelievably, there are far worse crimes than fraud that we completely ignore.

There are crimes associated with social convention in science of the type discussed by Karl Herrup with respect to 20 years of misguided focus on APP and abeta fragments in Alzheimer’s disease:

https://mitpress.mit.edu/9780262546010/how-not-to-study-a-di...

This could be called the “misdemeanors of scientific social inertia”. Or the “old boys network”.

There is also an invisible but insidious crime of data evaporation. Almost no funders will fund data preservation. Even genomics struggles, though it is way ahead of the rest of biomedical research. Neuroscience is pathetic in this regard (and I chaired the Society for Neuroscience’s Neuroinformatics Committee).

I have a talk on this socio-political crime of data evaporation.

https://www.youtube.com/watch?v=4ZhnXU8gV44

abigail95 · a year ago
You don't need regulation for a stable durable-goods market. Income and credit shocks cause turnover of good-quality stock in the secondary market.
dyauspitr · a year ago
It could also have a chilling effect on a lot of breakthrough research. If people become afraid to put out what they mostly think is right, it might set progress back decades as well.
Lionga · a year ago
BS governmental desperation to show any "result" (even if it is fake) is what brought us here, as scientists have to show more fake results to get more grants.

Removing the government from science could help, not the other way around.

SV_BubbleTime · a year ago
Good luck with that sentiment here.

People just went through the last five years and will go to their graves defending what they saw firsthand. To admit that maybe those moves and omissions weren’t helpful would be to admit their ideology was wrong. And that cannot be.

AndrewKemendo · a year ago
If I have learned anything over 40 years, it is that the number of people who actually live in a way consistent with the hypothesis-testing, data-collection, and evidence-evaluation framework required to have scientific confidence in future actions, or even claims, is effectively zero

That includes people who consider themselves professional scientists, PhDs, authors, leaders, etc.

The only people I know who consistently live “scientifically” are people considered “neurodivergent,” along the autism-ADHD-ODD spectrum, whose conditions force them into creating the kinds of mechanisms that are actually scientific.

Nevertheless, we should expect better from people, and on average we need to do better at aligning how they think with how science, when robustly applied, demonstrates with staggering predictability how the world works, compared to all other methods of understanding the universe.

The fact that the people carrying the torch of science don’t live up to the standard is expected - hence peer review.

This is an indictment of the incentives, and the pace at which bad science is revealed (as in this case) is always too slow, but science is the one place where eventually you either get exposed as a fraud or are never followed in the first place.

There’s no other philosophy that has a higher bar of having to conform with all versions of reality forever.

physPop · a year ago
I would just like to point out the irony of claiming that people live in a way inconsistent with scientific rigour, based solely on personal experience.
AndrewKemendo · a year ago
I think you’re suggesting that I’m drawing a conclusion without sufficient evidence, hence the “irony”.

Recall that I’m discussing how people live, namely that they don’t live by their own claims about how to live. You’d have to evaluate my behaviors to determine whether my claim is ironic.

However, I’m happy to provide that epistemological chain if requested.

molave · a year ago
It's disheartening to think that the virtues you are told to have as a kid are considered "weak sauce" once you are an adult.
AndrewKemendo · a year ago
The reason many people hate children is that children are not satisfied with the level of epistemology most people can provide them, and have no compunction about saying “that answer is unsatisfactory.”

Hence institutional pedagogy is so often rote and has nothing to do with understanding, even though the science of learning says that every human craves understanding (Montessori, Piaget, etc.).

In fact, the shortest way to break the majority of people’s brains is to ask them one of the following questions:

- Can you Explain the reasoning behind your behavior?

- How would you test your hypothesis?

- What led you to the conclusion you just stated?

- Can you clarify the assumptions embedded in your claim?

- Have you evaluated the alternatives to your position?

throwawaysleep · a year ago
But what virtues were rewarded when you were a kid?

My parents and grandparents would say I should be charitable, helpful, and share, but what actually made them happy? Beating other kids.

2OEH8eoCRo0 · a year ago
It's a dilemma: do you want to be virtuous or do you want to maximize your money? I get a sense around here that only the law matters (morals be damned) and we do whatever work pays best.
cdaringe · a year ago
> effectively zero

That feels extreme. Zero is a cold, dark, lonely number. Maybe it's correct; I don't know. I've worked on only a couple of projects in this space, and while the incentives certainly involved publishing, I don't feel that it equated to abandoning the scientific method. Instead, it was the cost to pay for the ability to continue doing science.

Can you really stand by ZERO? How about 1%? Meet me somewhere above zero, or, if you’d be so kind, make a compelling case for why we’re truly at rock bottom.

instakill · a year ago
OP said effectively zero, not zero. These are semantically very different.
booleandilemma · a year ago
It's funny you mention autism, ADHD, and similar. It's something I believe the science is quite shaky on.

I've met so many people who self-diagnose with those "conditions", because, I think, they want the world to feel sorry for them, or something.

AndrewKemendo · a year ago
The American version of the Cultural Revolution is about to begin, and everybody recognizes that the labor class is coming

So everybody’s trying to align themselves with a victimized group as closely to reality as possible

To such an extent where people are actively making up victimization reasons such that they can find themselves in an affinity group with other victims so they are safe from prosecution during the troubles

rebanevapustus · a year ago
I was the victim of a pretty bizarre super in-your-face academic theft.

Someone snooped a half-finished draft of mine off GitHub and...actually got it published in a real journal: https://forbetterscience.com/2024/05/29/who-are-you-matthew-...

Despite having a full commit log (with GitHub-verified commits!!!) of both the code AND the paper, neither arXiv nor the journal seemed to care or bother at all.

Anyhow, I highly recommend reading the For Better Science blog. It's incredible how rampant fraud truly is. This applies to multiple Nobel Prize winners as well. It's nuts.

fluidcruft · a year ago
This guy sounds like someone who would be fun for Javier Leiva's PRETEND podcast [0]. You should reach out to Javier.

[0] https://pretendradio.org/

cdaringe · a year ago
Can you speak more to the “not caring at all” bit? I believe you, but how did you engage them? Did you end up publishing your work eventually?

forbetterscience seems like a good idea, but the writing style, the images, and even the about page gave me pause about whether it's a reliable site for trustworthy science commentary.

rebanevapustus · a year ago
After almost a year, with an insurmountable amount of open-source evidence and with the thief having had every single paper he has ever written retracted for fraud(!!), the best the journal did was add a notice: https://www.mdpi.com/2674-113X/2/3/20

arXiv cared even less. They allowed the thief to DMCA-strike me multiple times. He even managed to take down the real version of the paper by claiming that it was his: https://arxiv.org/abs/2308.04214

> Did you end up publishing your work eventually

No. When I tried to do so, I was actually rejected from a conference because their plagiarism-detection system flagged that I was trying to publish something that had already been published (i.e., the stolen version).

It was very traumatic.

ProllyInfamous · a year ago
>I believe you, but how did you engage them?

From the article, it seems the engagement came in the form of DMCA take-down requests from university lawyers... which the publication then largely ignored for a considerable period of time (possibly due to counter-DMCA).

In an unrelated scientific field, EE, I recently witnessed how the DMCA process could be used by an "engineer" to silence criticism of his hybrid vehicle battery "upgrades" [2] — similar to Australian company DCS's snafu/lawsuit [1].

Just disgusting, these vultures that know [how to steal/lie/obfuscate] just well-enough to be dangerous... including how to manipulate our DMCA system to their dishonest advantage.

[1] youtube.com/watch?v=_QNMVMlx48E&pp

[2] theautopian.com/toyota-prius-owners-can-soon-swap-tired-old-batteries-for-sodium-ion-cells-but-drama-rages/

tomrod · a year ago
Huh. Sounds like the research needs to be forked to several different hosting providers, preferably ones not based in the US with its insane DMCA laws.
BenFranklin100 · a year ago
As a scientist who has published in the neuroscience space, I don’t know what to say other than that the incentives in academia are all messed up. Back in the late 90s, NIH made a big push on “translational research”, that is, researchers were strongly encouraged to demonstrate that their research had immediate, real-world benefits or applications. Basic research and the careful, plodding work needed to nail down and really answer a narrow question were discouraged as academic navel-gazing.

On one hand, it seems the push for immediate real world relevance is a good thing. We fund research in order that society will benefit, correct? On the other hand, since publications and ultimately funding decisions are based on demonstrating real world relevance, it’s little surprise scientists are now highly incentivized to hype their research, p-hack their results, or in rare cases, commit outright fraud in an attempt to demonstrate this relevance.

Doing research that has immediate translational benefits is a tall order. As a scientist you might accomplish this feat a few times in your career if you’re lucky. The rest of the corpus of your work should consist of the careful, mundane research the actual translational research will be based upon. Unfortunately it’s hard to get that foundational, basic, research published and funded nowadays, hence the messed-up incentives.

derbOac · a year ago
There's evidence that the turning point was in the 90s, but I suspect the real underlying problem is indirect funds as a revenue stream for universities, combined with the imposition of a for-profit business model expectation from politicians at the state and other levels. The expectation changed from "we fund universities to teach and do research" to "universities should generate their own income", which isn't really possible with research, so federal funding filled the gap. This led to the indirect-fund firehose of cash, pyramid-scheme labs, and so on. It sort of became a feedback loop, and now we are where we are today.

Translational research is probably part of it but I think it's part of a broader hype and fad machine tied to medicine, which has its own problems related to rent-seeking, regulatory capture, and monopolies, among other things. It's one giant behemoth of corruption fed by systemic malstructurings, like a biomedical-academic complex of problematic intertwined feedback loops.

I say this as someone whose entire career has very much been part of all of it at some level.

BenFranklin100 · a year ago
Good points, thanks. As I’m sure you’re aware, the indirect rates at some universities are above 90%. That is, for every dollar that directly supports the research, almost another dollar goes to the university for overhead. Much of this overhead is legitimate: facilities and equipment expenses, safety training, etc., but I suspect a decent portion of it goes to administrative bloat, just as the education-only side of the university has seen administrative bloat grow greatly over the last 30-40 years.
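
For readers outside academia, here is a rough back-of-the-envelope sketch of what a 90% rate means (hypothetical figures; real rates are negotiated per institution and applied to a modified direct-cost base):

    # Hypothetical figures, only to illustrate the "almost another dollar" point.
    direct_costs = 500_000                    # salaries, reagents, equipment, ...
    indirect_rate = 0.90                      # the high-end rate discussed above
    overhead = direct_costs * indirect_rate   # 450,000 goes to the university
    total_award = direct_costs + overhead     # 950,000 paid by the sponsor
    print(f"${overhead:,.0f} overhead on ${direct_costs:,} direct = ${total_award:,.0f} total")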

Another commentator made a separate point about how professors don’t always get paid a lot, but they make it up in reputation. Ego is a huge motivator for many people, especially academics in my observation. Hubris plays no small part in the hype machine surrounding too many labs.

ubj · a year ago
The Retraction Watch website does a good job of reporting on various cases of retractions and scientific misconduct [1].

Like many others, I hope that a greater focus on reproducibility in academic journals and conferences will help reduce the spread of scientific misconduct and inaccuracy.

[1]: https://retractionwatch.com/