Tech has made it easier to fake scientific results. Is a cultural shift required to fix the problem?

Paper retractions and image duplications are a symptom of a much larger problem

Published January 25, 2020 12:30PM (EST)

(Getty/Totojang)

This story originally appeared on Massive Science, an editorial partner site that publishes science stories by scientists. Subscribe to their newsletter to get even more science sent straight to you.

Cases of scientific misconduct are on the rise. For every 10,000 papers on PubMed, 2.5 are retracted, with more than half of these retractions attributed to scientific misconduct, which includes mismanagement of data and plagiarism.

"Papers from twenty or thirty years ago were fairly simple – they [had] maybe one or two photos," says Elisabeth Bik, a microbiologist who now works as a scientific integrity consultant. "That's around the time that I did my PhD. If we wanted to submit papers with photos, we had to make an actual appointment with a photographer! It was very hard to fake anything."

Tasks like photographing results and constructing academic figures were once specialized, requiring designated experts who had nothing to do with the data collection process. That's not the case in the 21st century. As technology has advanced, not only has the amount of data increased exponentially, but so has our ability to record and report this data. With more people competing for fewer academic jobs, scientists are constantly under pressure to acquire more data, publish in high impact journals, and secure more external funding. 

One study from Arizona State University found that mounting professional pressure and the low chance of getting caught are among the reasons scientific misconduct is so prevalent. Combined with readily available image-editing tools and the ease of cutting and pasting text, these pressures make it far less challenging to misrepresent findings.

In 2016, Bik and colleagues analyzed over 20,000 papers from 40 biomedical research journals, finding that roughly one in 25 contained images with evidence of inappropriate duplication. In the journal Molecular and Cellular Biology alone, 6.1% of papers showed signs of inappropriate image alterations.

One of the organizations looking for solutions to this growing issue of scientific misconduct is the International Life Sciences Institute (ILSI). Founded in 1978, the ILSI is an organization of scientists working in food safety and nutritional science. One of their major aims is to ensure scientific integrity in nutrition-related research, especially since research findings in this field often inform public health policy decisions. To find a solution, ILSI's North American branch (ILSI North America) co-founded the Scientific Integrity Consortium to evaluate the extent of scientific misconduct, and to broaden the scope of this conversation beyond food science. In 2019, the consortium published their findings, which included guidelines on how to define research misconduct and detrimental research practices, in addition to a comprehensive list of recommendations to tackle the issue.

These recommendations include encouraging scientists to connect their work to a broader social context and to consider its implications for the general public. To nurture this culture, both institutions and scientists need to take a number of steps. Institutions must provide the necessary educational resources, infrastructure, and quality-maintenance support for equipment and research, alongside better training and standardized, universal expectations for integrity. Scientists, in turn, need to follow standardized procedures for research design and publication, commit to transparency and honest communication, and be mindful of the ethical implications of their work.

The committee acknowledged that the training scientists receive is insufficient to help them navigate the different stages of their careers, and that the "publish-or-perish" mentality only makes it harder to create the cultural shift they recommend. For example, practices like "p-hacking," where a researcher selectively analyzes data until non-significant results appear significant, are more likely to occur under the pressure to secure funding.
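A small simulation makes the mechanism concrete. The sketch below is illustrative only and is not drawn from the consortium's report: it shows how testing many outcomes on pure noise and reporting only the best p-value inflates the apparent "significance" rate far beyond the nominal 5%.

```python
# Illustrative sketch of p-hacking: test many outcomes on data with no real
# effect, then report only the smallest p-value. The rate of "significant"
# findings far exceeds the nominal 5% threshold.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

n_experiments = 1_000  # simulated studies, each with no true effect
n_outcomes = 20        # outcomes a researcher could choose to test
n_per_group = 30       # sample size per group

false_positives = 0
for _ in range(n_experiments):
    # Both groups are drawn from the same distribution: the null is true.
    control = rng.normal(size=(n_outcomes, n_per_group))
    treatment = rng.normal(size=(n_outcomes, n_per_group))
    # "P-hack": run a t-test for every outcome, keep only the best result.
    p_values = [stats.ttest_ind(c, t).pvalue for c, t in zip(control, treatment)]
    if min(p_values) < 0.05:
        false_positives += 1

print(f"Share of studies reporting a 'significant' finding: "
      f"{false_positives / n_experiments:.0%}")
# With 20 independent tests, this lands near 1 - 0.95**20, roughly 64%.
```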

Some ways to foster this change are to provide better ethics training and to introduce a scientific integrity "checklist" for researchers to follow. The proposed checklist would lay out best practices for designing studies and writing papers, such as ensuring that methods are reproducible and that ethical data-analysis standards are upheld. On the institutional level, journals should be encouraged to value rigorous research that may not always yield conventionally "exciting" findings. Currently, many journals prefer to publish positive findings over negative or null findings, which are equally important for scientific progress. One way the consortium would like to nurture this culture is by changing the vocabulary used to communicate these findings: instead of referring to results as "positive" or "negative," they suggest terms such as "anticipated" and "unanticipated" findings.

Another important point made by the consortium was to further emphasize the role and importance of mentorship. Cases of scientific misconduct can put students in a difficult position: trainees often face the dilemma of reporting the misconduct at the risk of losing their own positions. Staying silent normalizes scientific misconduct and can lead to further instances of academic dishonesty. And once misconduct is caught, trainees can suffer the eventual backlash, including difficulty finding future positions.

Not only can open science reduce the chances of misconduct, but it can be an excellent resource for fellow scientists, and a way to increase public trust in the scientific process. In line with the open science efforts, some scientists are suggesting that individuals should be able to offer post-publication comments on papers, as opposed to having a static review process that ends after publication. This would allow every reader to issue comments and feedback, keeping the paper under constant "live" review. While some journals — such as eLife — currently allow post-publication feedback, platforms like PubPeer allow scientists to search and leave comments on papers from any journal.

When asked about what policy changes she would like to see to reduce scientific misconduct, Bik highlighted the importance of open communication and clear guidelines. "Every journal and every institute should have a contact person that anyone can contact – I cannot report cases if I can't find e-mail addresses," says Bik. "There should be guidelines for when a paper should be retracted, versus when a paper should be corrected."

In addition, it's worth noting that many errors are honest mistakes that don't necessarily warrant a retraction. Bik says that "90% of the scientists are very honest. We all make errors – the bigger our datasets get, the harder it becomes. There is so much data now!"

A recent study following 12 retracted publications found that, of the 68 papers citing them, only one had been reassessed and corrected to account for the retraction. Even after retraction, findings from flawed papers can live on. One example is the frenzy that continues to surround the 1998 paper that falsely claimed a link between vaccines and autism, despite its retraction in 2010.

Furthermore, the social stigma that follows a retraction due to scientific misconduct can spill over to collaborators who had nothing to do with it. Former collaborators of dishonest scientists can see an 8-9% drop in citations to their papers. Sadly, this means that potential whistleblowers may be less likely to report cases of misconduct for fear of jeopardizing their own careers by association with the perpetrator.

"I hope I can make people aware how much damage it can do to fake results – it can lead other people to pursue results that did not happen," says Bik. 

Knowing the consequences of scientific misconduct, Bik quit her full-time job to tackle this problem as a scientific integrity consultant. "My mission is to make sure that science is reliable."

 


By Bhavya Singh
