Tuesday, May 14, 2024

Flood of Fake Science Forces Multiple Journal Closures

WSJ article:

https://www.wsj.com/science/academic-studies-research-paper-mills-journals-publishing-f5a3d4bc?st=u0ripskhb4igz5k&reflink=desktopwebshare_permalink

Flood of Fake Science Forces Multiple Journal Closures

Wiley to shutter 19 more journals, some tainted by fraud

 Excerpts:

Fake studies have flooded the publishers of top scientific journals, leading to thousands of retractions and millions of dollars in lost revenue. The biggest hit has come to Wiley, a 217-year-old publisher based in Hoboken, N.J., which Tuesday will announce that it is closing 19 journals, some of which were infected by large-scale research fraud. 

In the past two years, Wiley has retracted more than 11,300 papers that appeared compromised, according to a spokesperson, and closed four journals. It isn’t alone: At least two other publishers have retracted hundreds of suspect papers each. Several others have pulled smaller clusters of bad papers....

The sources of the fake science are “paper mills”—businesses or individuals that, for a price, will list a scientist as an author of a wholly or partially fabricated paper. The mill then submits the work, generally avoiding the most prestigious journals in favor of publications such as one-off special editions that might not undergo as thorough a review and where they have a better chance of getting bogus work published. 

World-over, scientists are under pressure to publish in peer-reviewed journals—sometimes to win grants, other times as conditions for promotions. ...

Scientific papers typically include citations that acknowledge work that informed the research, but the suspect papers included lists of irrelevant references. Multiple papers included technical-sounding passages inserted midway through, what [Dorothy] Bishop called an "AI gobbledygook sandwich." Nearly identical contact emails in one cluster of studies were all registered to a university in China where few if any of the authors were based. It appeared that all came from the same source....

Another data scientist, Adam Day, built “The Papermill Alarm,” a tool that uses large language models to spot signs of trouble in an article’s metadata, such as multiple suspect papers citing each other or using similar templates and simply altering minor experimental details. 
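The signals described in these excerpts lend themselves to simple metadata checks. The sketch below is a toy screen, not Day's actual Papermill Alarm (which reportedly uses large language models): given a batch of submissions, it flags unusually large clusters sharing a contact-email domain and papers whose reference lists point at several other papers in the same batch. The field names (id, contact_email, references) and the thresholds are illustrative assumptions, not anything from the article.

```python
# Toy metadata screen inspired by the signals reported above: shared contact-email
# domains and suspect papers citing each other. Not the Papermill Alarm; field names
# and thresholds are assumptions for illustration only.
from collections import defaultdict


def flag_shared_email_domains(papers, min_cluster=5):
    """Group submissions by contact-email domain and flag unusually large clusters."""
    by_domain = defaultdict(list)
    for p in papers:
        domain = p["contact_email"].rsplit("@", 1)[-1].lower()
        by_domain[domain].append(p["id"])
    return {d: ids for d, ids in by_domain.items() if len(ids) >= min_cluster}


def flag_in_batch_citation(papers, min_links=3):
    """Flag submissions that cite several other papers from the same batch."""
    batch_ids = {p["id"] for p in papers}
    flagged = {}
    for p in papers:
        internal = batch_ids.intersection(p["references"])
        if len(internal) >= min_links:
            flagged[p["id"]] = sorted(internal)
    return flagged


if __name__ == "__main__":
    # A fabricated batch in which every submission shares an email domain and cites the others.
    batch = [
        {"id": f"paper-{i}",
         "contact_email": f"author{i}@example-univ.edu.cn",
         "references": [f"paper-{j}" for j in range(6) if j != i]}
        for i in range(6)
    ]
    print("Shared email domains:", flag_shared_email_domains(batch))
    print("Dense in-batch citation:", flag_in_batch_citation(batch))
```

Even crude checks like these surface the clustering behavior the article describes; a production screen would weigh many more signals.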

Monday, May 6, 2024

More on the failure of "peer review"

When articles don't include the code and data required to figure out if the work is legit, they can't be trusted--or even peer reviewed.

Reminiscent of COVID: https://x.com/wideawake_media/status/1787472500776943990

_____

Crémieux @cremieuxrecueil

In short, it's hard to tell what scientific work is 'legit' because so little of it includes the code and data required to figure that out.

Increasing code and data availability might be an important part of distancing ourselves from the replication crisis.


Another publication looked at 67 journals and reported that across economics, political science, sociology, psychology, and general science journals, code and data requirements are rare, and checking of results was common only in political science.

Some economists tried to reproduce the results of 67 economics papers and pretty much couldn't: even with help from the authors, only half of the papers ended up being reproducible, and this was still a problem at journals with required reporting of code and data.

An earlier paper looked at the evidence for reproducibility in international development impact evaluations. Similar story there: too much data was simply unavailable and some was discrepant.