Peer review, scientific integrity and community: a comment


A rite of passage for many scientists is their elevation to some kind of editorial board, usually associated with a scientific journal.  This is where they get to review the work of other scientists and become part of the decision-making process that results in publication – or rejection.  It is an excellent means of extending one’s network of people interested in the same discipline.  It is always a learning experience, no matter how many papers one reviews or edits over a lifetime: new ideas, new data, new methods, new ways of expression.  Admittedly, the task of reviewing a paper can arrive on your desk at precisely the wrong time.  But a good reviewer understands that there is always a quid pro quo: your own paper under review may arrive on someone’s desk at a time most inconvenient for them.  So you do the job anyway.

Peer review is every scientist’s responsibility.  The integrity of science, as a purveyor of understanding about how the universe works, depends on the dissemination of ideas and information.  A new discovery will not mean a great deal unless it is presented to the scientific community – in fact all communities – in a manner that allows others to evaluate it in terms of the logic of its arguments, the methods used to generate data (for example, can experimental results be duplicated?), how the data were evaluated (maths, statistics), and whether the conclusions are justified by the data or are just wishful thinking.  In the first instance, this is the job of reviewers and editors; once the paper has been published, it is the job of the rest of the community to evaluate its veracity.

Peer review, by and large, does work, at least for journals and periodicals that are well known.  Fraudulent and plagiarised works do get published, slipping through the cracks and leaving editors and reviewers red-faced (note, however, that authors commonly do not know who has reviewed their submitted manuscript).  There have been some notorious examples of scientific fraud – Piltdown Man, Beringer’s Lying Stones, the MMR vaccination and autism paper – and the list continues.  But we know these are frauds because the peer review system eventually caught up with them.

There are now more than 28,000 science journals, and each of these publishes dozens of papers every year.  Almost one million science papers were published in 2009; the number today is significantly higher.  These figures apply only to actual journals; not included in the count are the myriad websites that are not vetted and have zero accountability.

Accompanying the headlong rush by reputable publishers to produce digital copy come the predators and fraudsters: journals that profess stringent rules of review and manuscript acceptance, but practise exactly the opposite.  There are online journals out there to which one can submit a manuscript that is complete gibberish, containing paragraphs cut and pasted from pretty well anything you like, and actually have it published.  The Guardian reported (October 2016) that a New Zealand professor of Human Technology received an invitation to submit a paper to a nuclear physics conference in the US.  Bemused, and knowing little about nuclear physics, he decided to test the system by submitting a paper in which he typed the first word of each sentence, such as “Nuclear”, and then used an autocomplete function to finish it.  His paper was accepted.

The advent of digital publication has seen a surge in fraudulent submissions, especially in medical journals; one analysis puts the rise in fraud at about 1700% since 2004 (Wall Street Journal).  The internet has opened the aether to ingenious methods of fooling editors and publishers.  These shysters will publish invented data, manipulated data, plagiarised data – you name it.  The peer review process does identify fraud in many of these submissions, but not all.  One South Korean medical researcher created false email addresses through which he could arrange to peer review his own manuscripts.  He managed to publish 28 articles before he was discovered; the papers were withdrawn from the journals.  Another researcher, at a Taiwanese university, also exploited the ease with which false email addresses can be created, and got away with 60 publications before he too was nabbed.

The incidence of fraud has become a serious problem, in part because it creates a huge amount of work for honest publishers and editors, but more importantly because it is a direct attack on the integrity of science.  Reputable journals are constantly on the lookout for fraud and plagiarism; peer review is an important part of this investigative process.  But in some ways the peer review safety net is a bit like the ambulance at the bottom of the cliff.  There needs to be much greater accountability from the institutions where these fraudsters work.  The Geological Survey of Canada, where I worked for many years, required internal reviews by two or three qualified people before a manuscript was sent for external publication; those internal reviews were always stringent.

Peer review, generally regarded as the gold standard of scientific honesty and integrity, does work, although it may not be quite as glossy as it once was.  Peers, like authors, are people, so it should be no surprise that the vagaries of humanity sometimes interfere with the steady flow of truth.  But peer review is not just about honesty; it is also about the engagement of scientists with their communities, and the protection of the trust that broader communities should have in science.  Unfortunately for some, the gloss has also come off this sense of trust.  We have a lot of work to do.
