Nineteen journals shut down by Wiley following delisting and paper mill problems


Publishing giant Wiley has closed 19 journals, once again putting concerns around fake studies front and centre.
The withdrawn journals were all owned by Hindawi, a company Wiley bought in January 2021 that was later found to have a paper mill problem at some of its titles. Paper mills produce phoney manuscripts for publication, authorship of which can then be sold.
Wiley stated that the removal of the titles was normal business practice and that the journals, which included the Journal of Nanomaterials, were pulled as they no longer served the needs of their communities.
Eleven of the shuttered journals had been delisted from Web of Science last year. The other eight were closed because they were unable to publish a viable number of articles ‘or because we have stronger similar journals elsewhere in the portfolio’, according to a Wiley spokesperson.
The scientific publisher drew a line between these closures and moves it made in May 2023 to close four journals. Hindawi described these four journals as heavily compromised by paper mills and said it would continue to retract content from these journals after closure.
‘Paper mills are illicit businesses that engage in a variety of questionable and irresponsible business practices from fabricating data, to fabricating papers and selling authorship,’ says Sarah Eaton, an expert on academic ethics at the University of Calgary, Canada. ‘They are focused on generating profit at scale and they do not care about science.’
‘Several of these journals were flooded with retractions, which means they were massively attacked by paper mills,’ says Anna Abalkina, a research fellow at the Free University of Berlin, who has studied corruption in higher education in Eastern Europe and Central Asia.
Of the 19 journals shuttered, a telecommunications journal and a complementary and alternative medicine journal each had over 500 retractions on the Retraction Watch database, while an engineering journal had over 190 retractions. By contrast, high-profile chemistry journals such as Angewandte Chemie and Chemical Science have fewer than 30 retractions between them. ‘We don’t expect a reputable journal to be associated with massive amounts of low-quality papers that are later retracted,’ says Abalkina.
It has been estimated that over 400,000 research articles have been published that ‘show strong similarities to known studies produced by paper mills’, with around 70,000 published last year.
Blacklisted
In Norway, the National Board of Scholarly Publishing helped deliver a strategy document to the government that highlighted the need for lists of accepted journals, unacceptable journals and ‘journals under discussion’.
‘The list does not surprise me,’ says Vidar Røeggen, who adds that the Journal of Nanomaterials was on the ‘under discussion’ list. ‘I’m glad to see Wiley acting in this way. This shows that they are taking the problem seriously. It’s an example for others to follow.’
He says many publishers are now wary of special issues, which attract paper mills, ‘and serious publishers will be active, testing out new tools for screening and detecting problematic papers’.
Abalkina says some paper mills target ‘reputable publishers like Wiley, Elsevier, Taylor and Francis, Sage Publications and Springer Nature’, while others target low-quality and predatory journals. ‘They are interested in all types of journals indexed in Scopus and Web of Science because of how countries evaluate research,’ she adds.
Last year the Committee on Publication Ethics (Cope) issued a position statement on paper mills urging immediate action, with a dozen organisations involved in scholarly publishing describing paper mills as a danger to the integrity of the scholarly record.
A report from Wiley noted that it was able to systematically pinpoint guest editors who were responsible for handling multiple retracted papers. But it also observed that AI tools can be exploited to generate fabricated data, manipulate images and devise entirely fraudulent studies.
Torturing language
Computer scientist Guillaume Cabanac at the University of Toulouse has developed a Problematic Paper Screener to look for what he calls ‘tortured phrases’ in scientific publications. Examples include ‘mind tumour’ instead of brain tumour; ‘signal-to-clamour ratio’, rather than signal-to-noise ratio; and ‘lactose bigotry’ instead of lactose intolerance. Such phrases arise from copying, pasting and paraphrasing to disguise plagiarised text, says Cabanac.
He has discovered around 14,000 papers containing these tortured phrases across almost all the big scientific publishers. If an author or a third party uses such tricks to hide plagiarism, it raises serious questions about the experiments too.
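The basic idea behind this kind of screening can be sketched as a dictionary lookup: scan a manuscript for known ‘fingerprint’ substitutions and flag any hits. The phrase list below uses the three examples quoted above, but the function names and matching logic are illustrative assumptions, not Cabanac’s actual Problematic Paper Screener implementation.

```python
import re

# Each tortured phrase maps to the established term it likely replaced.
# (List is illustrative; a real screener catalogues thousands of these.)
TORTURED_PHRASES = {
    "mind tumour": "brain tumour",
    "signal-to-clamour ratio": "signal-to-noise ratio",
    "lactose bigotry": "lactose intolerance",
}

def screen_text(text):
    """Return (tortured phrase, expected term) pairs found in the text."""
    hits = []
    lowered = text.lower()
    for tortured, expected in TORTURED_PHRASES.items():
        # Whole-phrase match, so 'mind tumours' still flags 'mind tumour'
        # but unrelated words containing the substring do not.
        if re.search(r"\b" + re.escape(tortured), lowered):
            hits.append((tortured, expected))
    return hits

abstract = "We improve the signal-to-clamour ratio in scans of a mind tumour."
print(screen_text(abstract))
```

A clean abstract returns an empty list, while the hypothetical abstract above is flagged twice. In practice, a hit is only a signal for human follow-up, not proof of misconduct.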
Cabanac’s discoveries stray into the ridiculous. One paper had an AI-generated image of a rat that apparently had testicles bigger than the animal’s head. He names and shames publishers on social media to encourage them to retract dubious papers.
His screening algorithms have been taken up by multiple publishers. In late 2023, Clarivate outlined ramped-up evaluation procedures for Web of Science, taking into account extreme levels of authorship, anomalously high self-citation and issues such as tortured phrases and spurious references. IOP Publishing has also used the screener to find dozens of dubious papers.
Nonetheless, the incentives favour fraud. ‘We are volunteers and we have our side jobs,’ says Abalkina. ‘While paper mills can make millions in profit.’
Eaton says that it is not just publishers who need to take responsibility. Research institutions ‘contribute to the problem by continuing to reward number of publications per year over the quality of those publications’, she points out.
‘The higher ed metrics systems that factor in numbers of publications, numbers of citations and h-index numbers are also harmful to the overall focus on the quality of science. We need to focus more on the quality of scientific outputs, rather than the quantity.’
