Progress on reproducibility


Science  05 Jan 2018:
Vol. 359, Issue 6371, pp. 9
DOI: 10.1126/science.aar8654

Ideas supported by well-defined and clearly described methods and evidence are a cornerstone of science. After several publications indicated that a substantial number of scientific reports may not be readily reproducible, the scientific community and the public began discussing mechanisms to measure and enhance the reproducibility of scientific projects. In this context, several innovative steps have been taken in recent years. The results of these efforts confirm that improving reproducibility will require persistent and adaptive responses and, as we gain experience, implementation of the best possible practices.

A framework has been developed to promote transparency and openness in scientific publications—the Transparency and Openness Promotion (TOP) guidelines (http://science.sciencemag.org/content/348/6242/1422.full). They cover key principles that apply to many scientific fields, although they were developed primarily by social scientists. The editors at Science have adjusted practices based on these policies and have gained experience with many of these issues. We fully support the principles behind these guidelines, including the centrality and benefits of transparency, as captured in our editorial principle that “all data and materials necessary to understand, assess, and extend the conclusions of the manuscript must be available to any reader” of Science and the Science family of journals. Our editorial policies now contain specific statements for each TOP guideline category. In some cases, we include the possibility of granting specific exceptions but insist that these circumstances be discussed with our editors early in the manuscript evaluation process to allow for thoughtful examination.

Another approach to assessing reproducibility involved an experimental program in which groups not involved with the original studies attempted to replicate selected findings in cancer biology (see https://elifesciences.org/collections/9b1e83d1/reproducibility-project-cancer-biology). Although some findings were largely reproduced, in at least one case (which was published in Science), the key finding was not. Yet the initial results have been used and extended in published studies from several other laboratories. This case reinforces the notion that reproducibility, certainly in cancer biology, is quite nuanced, and that considerable care must be taken in evaluating both initial reports and reported attempts at extension and replication. Clear description of experimental details is essential to facilitate these efforts. The increased use of preprint servers such as bioRxiv by the biological and biomedical communities may help in communicating both successful and unsuccessful replication results.

Over the past year, we have retracted three papers previously published in Science. The circumstances of these retractions highlight some of the challenges connected to reproducibility policies. In one case, the authors failed to comply with an agreement to post the data underlying their study. Subsequent investigations concluded that one of the authors did not conduct the experiments as described and fabricated data. Here, the noncompliance with the data-posting policy signaled a much deeper problem, illustrating one of the benefits of data-transparency policies. In a second case, some of the authors of a paper requested retraction after they could not reproduce the previously published results. Because not all authors of the original paper agreed with this conclusion, the authors decided to attempt additional experiments to try to resolve the issues. These experiments did not conclusively confirm the original results, and the editors agreed that the paper should be retracted. This case again reveals some of the subtlety associated with reproducibility. In the final case, the authors retracted a paper because of extensive and incompletely described variations in image processing. This emphasizes the importance of accurately presented primary data.

As this new year moves forward, the editors of Science hope for continued progress toward strong policies and cultural adjustments across research ecosystems that will facilitate greater transparency, research reproducibility, and trust in the robustness and self-correcting nature of scientific results.
