Software makes it easier for journals to spot image manipulation
BMJ 2007; 334 doi: https://doi.org/10.1136/bmj.39160.666204.BD (Published 22 March 2007). Cite this as: BMJ 2007;334:607

The digital age has made it easier to enhance the results of scientific studies, but it is also starting to provide the means to counter the trend, delegates at an international seminar on research misconduct heard last week.
Mike Rossner, executive editor of the Journal of Cell Biology, said that "image manipulation" had become part of everyday culture, especially on the internet, and that the widespread availability of computer software such as Photoshop had made it easy to do.
Dr Rossner was speaking at the annual seminar of the Committee on Publication Ethics (COPE) in London, attended by more than 85 scientific journal editors and publishers.
No comparative figures for image manipulation were available, he said. “But the frequency with which we are seeing image manipulation by authors speaks to some misunderstanding of the line between acceptable and unacceptable,” he suggested.
The phenomenon had prompted the journal to issue strict guidelines for digital images to clarify the difference.
“Scientists expect and assume basic scientific honesty—that is, that the data presented are accurate representations of what was actually observed,” he said.
Most cases were categorised as “inappropriate,” where the manipulation violated the journal's guidelines but did not affect interpretation of the data. But it was a sizeable problem.
“At least 25% of all accepted manuscripts have at least one figure that has to be remade,” he said.
Deliberate manipulation of an image to make the results appear better than they were was much rarer, occurring in 1% of all papers the journal accepted, he added.
“If you think of the possibility that of all the publications in PubMed, 1% of them have some fraudulent data in them, I personally find that a worry,” he said.
But increasing electronic submission and processing of articles for publication would help editors to spot telltale changes in contrast and background, which would not be visible on a print-out, he said.
“The irony is that although Photoshop makes it easier for an author to manipulate an image, it also makes it easier for an editor to detect,” he said, adding that specialist detection software was also now being developed.
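The kind of automated check such detection tools might run can be illustrated with a short sketch. The example below is illustrative only and is not the specialist software described at the seminar: it scans a greyscale figure for blocks of pixels that are suspiciously uniform, one simple sign that a background may have been erased or cloned. The file name, block size, and threshold are assumptions.

# Minimal sketch of a background-uniformity check on a greyscale figure.
# Thresholds and block size are illustrative; real screening tools are more sophisticated.
import numpy as np
from PIL import Image

def flag_flat_regions(path, block=16, std_threshold=0.5):
    """Return coordinates of image blocks whose pixel variation is suspiciously low.

    Perfectly uniform patches can indicate that part of the background was
    erased or cloned, one of the telltale changes in contrast and background
    that editors look for.
    """
    img = np.asarray(Image.open(path).convert("L"), dtype=float)
    h, w = img.shape
    suspicious = []
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            patch = img[y:y + block, x:x + block]
            if patch.std() < std_threshold:  # almost no noise at all in this block
                suspicious.append((x, y))
    return suspicious

if __name__ == "__main__":
    hits = flag_flat_regions("figure1.tif")  # hypothetical submitted figure
    print(f"{len(hits)} suspiciously flat blocks found")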
Delegates also heard how a range of programmes can be used to spot plagiarism in articles, by analysing subtleties in tone and language to highlight similarities and inconsistencies in the text.
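The simplest form of such text screening can be sketched as follows. The example compares the five-word phrases shared by a submission and a published source; the file names and phrase length are assumptions, and the programmes referred to at the seminar analyse tone and language far more subtly than this.

# Minimal sketch of plagiarism screening by word n-gram overlap.
def ngrams(text, n=5):
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap_score(submission, source, n=5):
    """Fraction of the submission's five-word phrases that also appear in the source."""
    sub, src = ngrams(submission, n), ngrams(source, n)
    return len(sub & src) / len(sub) if sub else 0.0

if __name__ == "__main__":
    a = open("submission.txt").read()        # hypothetical submitted manuscript
    b = open("published_article.txt").read() # hypothetical earlier publication
    print(f"shared 5-word phrases: {overlap_score(a, b):.1%}")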
Sunil Moreker, editor of the Journal of the Bombay Ophthalmology Association, said that the software had helped him identify 13 plagiarised articles in two years, compared with only two in the previous two years.
Publicising its use in the journal, which was important to avoid potential legal repercussions, he said, had also deterred would-be plagiarists: after two years of running the system and making it known to authors, no articles containing plagiarism were submitted in the third year.
“The problem for editors is to find the additional money and staff to resource these programmes,” said Harvey Marcovitch, chairman of the committee. “Organisations like COPE need to raise awareness with medical school deans and teach medical students about research misconduct,” he added.