A picture is worth a thousand words. Unfortunately, sometimes those words tell lies.
A new analysis of almost 1,000 scientific papers has revealed that a shocking number contained inappropriately duplicated images – and while many of these duplications resulted from honest mistakes, about one in ten of the papers flagged ended up being retracted.
That's a pretty alarming statistic. But whichever way you look at it, the findings (which have not yet been peer-reviewed) suggest scientific publishing has a major quality control problem when it comes to properly vetting images in papers – and that's before considering cases where the duplications represent something even more sinister.
"At one end of the spectrum, inappropriate image duplications caused by simple errors in constructing figures raise concerns about the attention given to the preparation and analysis of data," the authors of the analysis write.
"While at the other end of the spectrum, problems resulting from deliberate image manipulation and fabrication indicate misconduct."
Starting a new set of papers to scan for duplicated figures or parts of figures. We start off with paper #4. New set, same journal. pic.twitter.com/kFulJhtVSx
— Elisabeth Bik (@MicrobiomDigest) July 2, 2018
The team, led by microbiologist and science editor Elisabeth Bik of biotech company uBiome, sifted through 960 papers published in the journal Molecular and Cellular Biology (MCB) from 2009 to 2016.
Using software to double-check any images that looked potentially doctored, the researchers ultimately discovered that 59 of the articles (6.1 percent) contained inappropriately duplicated images.
Sometimes these duplications repeated the exact same panel twice in a paper; in other cases the image had been shifted or rotated. In still other instances, anomalies could be detected within a single image.
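The story doesn't detail the software the team used, but here's a minimal sketch of how automated duplicate screening can work in principle. This is not the study's actual pipeline: the library choice (the open-source Python imagehash package), the file names, and the threshold are all illustrative assumptions.

```python
# A minimal sketch of one way automated duplicate screening could work.
# NOT the study's actual tool: library, file names, and threshold here
# are illustrative assumptions. Requires: pip install pillow imagehash
from itertools import combinations

from PIL import Image
import imagehash

# Maximum Hamming distance between perceptual hashes to flag a pair;
# 0 means identical hashes, small values tolerate recompression noise.
THRESHOLD = 4

def panel_hashes(path):
    """Hash a figure panel plus its rotations, since perceptual hashes
    are not rotation-invariant and reused panels may be rotated."""
    img = Image.open(path).convert("L")  # compare in grayscale
    variants = [img] + [img.rotate(angle, expand=True) for angle in (90, 180, 270)]
    return [imagehash.phash(v) for v in variants]

def find_duplicates(paths):
    """Return pairs of panels whose closest hash variants nearly match."""
    hashes = {p: panel_hashes(p) for p in paths}
    flagged = []
    for a, b in combinations(paths, 2):
        distance = min(ha - hb for ha in hashes[a] for hb in hashes[b])
        if distance <= THRESHOLD:
            flagged.append((a, b, distance))
    return flagged

if __name__ == "__main__":
    # Hypothetical panel files extracted from a paper's figures
    for a, b, d in find_duplicates(["fig1a.png", "fig1b.png", "fig2c.png"]):
        print(f"Possible duplicate: {a} vs {b} (hash distance {d})")
```

Real screening (and trained eyes like Bik's) goes much further – catching partial overlaps, crops, and contrast tweaks – but this captures the basic idea of flagging suspiciously similar panels.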
Given the blotchy, ambiguous, and specialised subject matter of many scientific paper images and graphs, it's perhaps not surprising that this dubious content slipped under the radar until now.
Once the team flagged the duplications with the journal's editor-in-chief, each paper's authors were contacted about the discrepancies – which led to 42 corrections, 12 instances in which no action was taken (for various reasons, including lab closures and the amount of time that had passed), and five retractions.
Paper #8 has this blot - left and right columns represent different cell lines, but the mTOR blots look very similar. They are cropped differently and have different grey scales, but their shape is very similar. pic.twitter.com/D7dk0uy7Op
— Elisabeth Bik (@MicrobiomDigest) July 2, 2018
Those five retractions are the real sticking point here. They may be a small number from a limited sample, but they point to a bigger problem threatening the reliability and credibility of published science as a whole.
"Nevertheless, ~10 percent of papers with inappropriate image duplications in MCB were retracted," the authors explain.
"If this proportion is representative, then as many as 35,000 papers in the literature are candidates for retraction due to image duplication."
That's a pretty breathtaking figure, and it echoes previous research by Bik, which found problematic images haunted some 4 percent of biomedical papers.
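For the curious, that extrapolation can be reconstructed from the numbers in this story alone; the size of the literature base is the implied quantity, not something the researchers are quoted on directly:

```latex
\frac{5 \text{ retractions}}{59 \text{ flagged papers}} \approx 8.5\% \approx 10\%
\qquad
0.04 \times 0.10 \times N \approx 35{,}000
\;\Rightarrow\;
N \approx 8.75 \text{ million papers}
```

In words: if roughly 4 percent of an N-paper literature contains inappropriate duplications, and roughly 10 percent of those warrant retraction, the 35,000 figure implies a body of close to nine million biomedical papers.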
So, what's the solution? According to the team, their analysis isn't a witch hunt – rather, it's meant as a wake-up call about this largely invisible, endemic problem.
"Studies like ours are … meant to raise awareness among editors and peer reviewers," Bik told Retraction Watch.
"Catching these errors before publication is a much better strategy than after publication."
From that same paper, this flow cytometry figure appears to show 2 duplicated pairs of panels. Is this a simple mistake? Maybe - but how to explain the different percentages? pic.twitter.com/684YwZycDY
— Elisabeth Bik (@MicrobiomDigest) July 2, 2018
In a pilot program at MCB as part of the research, the team found it took journal staff six hours to deal with problematic images once they had been published, but only 30 minutes to address the images before publication.
In other words, tackling this problem head-on before papers get published won't just protect science's reputation – at roughly a twelvefold saving in staff time, it's also a far more efficient use of editorial resources.
Training journal staff to spot duplicated images could be a big help, as would ensuring researchers enlist extra colleagues to help vet or prepare images for papers, to cut down on honest mistakes.
These measures will help mitigate the problem – and they'll need to, because the technological sophistication of image forgery isn't going to stand still, the researchers warn.
"We are just starting to recognise these problems," Bik said.
"I also expect, unfortunately, that people who really want to commit science misconduct will get better at photoshopping and generate images that cannot be recognised as fake using the human eye."
The pre-print findings are reported on bioRxiv.