In October last year, Science published a journalistic investigation into the quality of peer review in open access journals. The results were sobering. Around 60% of the journals targeted accepted for publication a research paper riddled with obvious and basic mistakes – in fact, the whole paper, its data, its authors and their affiliations were entirely made up by the journalist, John Bohannon, to expose poor peer review.
The article provoked a great deal of media attention, as well as a backlash from open-access publishers and supporters, who called it unethical and unsound, and even accused the journalist of racism (for making up authors with African names).
But regardless of the criticisms, the investigation’s surprising findings stand and should be a cause of grave concern for science and science publishing: they show that many – if not most – open access journals do not have an editorial and peer review process strict enough to catch poor research and flawed papers.
Science's investigative article intrigued me in particular, because last year I commissioned a similar feature for SciDev.Net, the website where I edit news and features. I, too, had the idea of sending out fake, flawed papers to catch the 'bad journals' that would accept them, but the time and money this required meant we skipped the investigative part and based our article on interviews with the people affected.
The key findings were that the problem is global: some journals prey on researchers, taking their money without providing proper peer review, and the 'publish or perish' culture draws scientists, especially in developing countries, to publish in such journals. Experts suggested that investigation and regulation were needed to ensure proper peer review, but there was little indication that such regulation would happen any time soon.
Another reason for not sending out fake papers was concern over how to do so ethically and legally. In fact, the prospect of being sued by journals or their publishers for even talking about the issue meant we had to be extra careful: we ran the article by media lawyers, amended some sections, and still accepted some risk of being sued. Bohannon, in his recent interview with The Scholarly Kitchen blog, says his investigation, too, was initially held back by an editor who feared a lawsuit.
And here's the thing: a huge number of journals and publishers out there are doing a poor job, publishing suspect science and even charging scientists money for it, yet none of this is illegal – and no national or international body can order such journals to shut down. What they do is bad for science, but good for the publishers who profit from it, and even good for some scientists, who publish there so as not to perish rather than because they have significant findings to communicate. And yet doing so breaks no laws.
Journalists who want to report on this fear being sued and are held back from even investigating. This is why I think Science's article is so important: it was brave enough to investigate and expose bad practice despite a genuine prospect of a lawsuit. This is what real journalism is about: telling stories that someone, somewhere, does not want told; and seeing it done in science, where investigative stories are rare, is especially satisfying.
Even after this exposé there may be no consequences for most of the journals and publishers involved. Indeed, apart from the International Journal of Integrative Medicine, published by Rijeka-based InTech, which closed down as I reported in a Retraction Watch blog post, Bohannon says he is not aware of any other closures.
In a legal void where anyone can set up a 'scientific journal' online and start charging scientists for 'publishing', it falls to national and international grant-giving bodies and funders to exclude journals with poor peer review from counting towards scientists' grant, job and promotion applications.
Science's investigation covered most – or all, as Bohannon says – open access publishers that publish in English in the sciences (such as biology, medicine and chemistry), targeting 304 journals, many of which were listed in the Directory of Open Access Journals and some, tellingly, in Beall's List of predatory publishers.
This left out thousands of journals that publish in local languages, including many in South-East Europe. Croatia alone has 343 academic journals listed on Hrčak, the central portal of Croatian scientific journals. Most are open access and funded by the government, yet scientists often criticise many of them as a waste of public money and as dumpsters for bad science that cannot be published in better international journals. The quality of peer review, especially in local languages, is also in question.
Similarly, in Serbia, SCIndeks lists 411 academic journals. Yet the Centre for Evaluation in Education and Science, which runs the index together with the National Library of Serbia, recently found that up to 11% of all articles published there contained some form of plagiarism. The centre later said that "after about one year time we have to admit that the expected response by journal editors is still missing", and itself excluded only two of the biggest culprits from SCIndeks.
My own journalistic investigation into whether and how plagiarised papers are retracted from journals in Serbia and Croatia likewise shows a lack of standards and wide variation in retraction practices – often at odds with internationally accepted guidelines, such as those set by COPE.
If journals fail to detect plagiarism – a routine check these days – one wonders what the state of peer review and the detection of other forms of misconduct may be. Indeed, a more recent study by the same centre found what it calls "a citation cartel created for manipulative purposes by two predatory journals" run by a publisher based in Bosnia and Herzegovina, where many Serbian researchers regularly publish. The word 'cartel' implies that scientists know they are doing something wrong: paying public money to publish in their friends' journals and citing other studies in those journals to artificially boost their impact factors.
What these examples highlight is that publishing misconduct – or at least suspect practice – has by no means bypassed South-East Europe. In fact, small scientific communities, peer review in local languages, and a lack of publishing and scientific expertise are all likely to exacerbate the problems of conducting proper peer review in small, local journals.
Indeed, of the five journals in the former Yugoslavia that Bohannon targeted, only one – the Bosnian Journal of Basic Medical Sciences – recognised the problems with the fake paper and rejected it. The other four – the International Journal of Integrative Medicine (InTech, Croatia), the Journal of Plant Biology Research (International Network for Applied Sciences and Technology, B&H), Acta Facultatis Medicae Naisensis (Medical Faculty of the University of Niš, Serbia) and the Macedonian Journal of Medical Sciences (Institute of Immunobiology and Human Genetics in Skopje, Macedonia) – all accepted it; had this not been a journalistic investigation, they could all by now have published similar fake papers.
When asked about this case, the editorial offices of the Journal of Plant Biology Research and Acta Facultatis Medicae Naisensis did not reply to my e-mails, which is discouraging. It shows how little transparency some journals are prepared to offer, and the extent to which they can simply ignore such exposés, even one conducted by the venerable Science magazine.
The answers I received from the other three journals’ editorial offices shine some light on the issues in the region.
The editor of the Macedonian Journal of Medical Sciences, Professor Mirko Spiroski, PhD, MD, told me that his editorial team and peer reviewers lacked expertise in the fake article's field, and that after seeking ten peer reviews and receiving only one back (a single-line review), they decided to accept the paper anyway. InTech essentially said it had given its appointed scientific editors, who were not part of the firm, full freedom in peer review – and then blamed the mistake on them.
This shows a lack of in-house expertise at some journals, and a worrying reliance on outside editors or peer reviewers, with little oversight, to decide whether or not to publish a paper.
In contrast, the editors of the Bosnian Journal of Basic Medical Sciences, Professor Bakir Mehić, PhD, MD, and Amina Valjevac, PhD, MD, highlighted the value of an in-house pre-review check of papers before sending them out to peer reviewers.
As the world continues to debate the merits of peer review and potentially better ways of doing it (such as post-publication peer review, or the community peer review that Peerage of Science and LIBRE offer before a paper reaches journals), we should take care to ensure proper peer review in our journals now. It is not rocket science, and it has long worked well for the best journals.
And national bodies and funders should recognise good practice and reward hard-working, ethical editorial offices to stimulate excellence and better peer review, while also punishing misconduct and being quicker and more proactive in striking known offenders off citation indices and off the lists of journals accepted for official grants and job promotions.
A version of this article was first published as an editorial in the November 2013 issue of the Bosnian Journal of Basic Medical Sciences.