The Reproducibility Initiative, a project we’ve written about before, has reached a major milestone. They have been awarded $1.3M in funding from the Center for Open Science and the Laura and John Arnold Foundation to replicate 50 key findings in cancer biology. Mendeley has supported the initiative by helping to design the selection process for papers, using Mendeley readership in addition to traditional citation measures.
We try to keep ahead of the issues in research, pushing for open access and better tools for researchers. Over the past few years, from the Stapel affair in psychology to reports from Bayer and Amgen of their failures to replicate most of the high-impact biomedical research they studied in-house, reproducibility has emerged as a key issue. This comes as no surprise to us; in fact, John Ioannidis’ paper “Why Most Published Research Findings Are False” has been one of the all-time most highly read papers on Mendeley.
So we’re super excited for the Reproducibility Initiative and for the change it can bring to research, and I’m personally proud of my work to support it. I’m also stoked that Mendeley readership data has become good enough quality to be used in a project of this caliber. There have been some criticisms of the project: some folks say that certain research is inherently not reproducible, and that 50 studies is just a drop in the bucket. Both points are true, but the Initiative will be working closely with the authors of the selected papers (full list to be released soon), focusing on the highest-impact, hypothesis-driven work, and using independent third-party expert labs to do the replications, so if they can’t replicate the findings, there’s a good chance no one else will be able to either. The data derived from these and any future replications will be analyzed with a view towards finding ways to surface the most robust work. Wouldn’t it be great if you could get a little recognition for taking the time to do careful, reproducible work?
Here’s the official announcement:
PALO ALTO, Calif. — October 16, 2013 — The Center for Open Science announced today that it would designate $1.3M of funding from the Laura and John Arnold Foundation towards the Reproducibility Initiative to independently validate 50 landmark cancer biology studies. The 50 chosen studies are among the highest impact studies in the field over the period of 2010 to 2012, and systematic validation could be crucial to developing future cancer drugs.
“The lack of reproducibility in cancer studies is a major obstacle in the development of viable therapies to cure cancer,” said Dr. Elizabeth Iorns, co-director of the Reproducibility Initiative. “The Reproducibility Initiative hopes to transform the scientific process by enabling researchers to verify key scientific findings and incentivizing scientific replication. The funding will be instrumental in not only verifying landmark cancer studies, but also helping to institutionalize scientific replication.”
The Reproducibility Initiative was launched by several prominent scientific journals and organizations (Science Exchange, Mendeley, PLOS, and figshare) last year in response to revelations from the pharmaceutical industry that more than 70 percent of published cancer research cannot be reproduced, thus stifling the development of effective new therapies. The Reproducibility Initiative intends to identify and reward high-quality reproducible research through independent validation of key experimental results. The Reproducibility Initiative has already begun progress in the effort to improve reproducibility through a partnership with antibodies-online to independently validate thousands of commercial antibodies.
With this funding, the Reproducibility Initiative will develop an integrated collaboration with the similarly named Reproducibility Project, already supported by the Center for Open Science. The Reproducibility Project is a crowd-sourced effort by researchers to identify the predictors of reproducibility in a large sample of published studies in psychological science. “The integration of these two projects is an opportunity to understand and address reproducibility challenges that are shared across scientific disciplines,” said Brian Nosek, director of the Center for Open Science.
The key experimental findings from each cancer study will be replicated by experts from Science Exchange according to best practices for replication established by the Center for Open Science through the Center’s Open Science Framework, and the impact of the replications will be tracked on Mendeley’s research analytics platform. All of the resulting publications and data will be freely available online, providing the first publicly available complete dataset of replicated biomedical research and representing a major advancement in the study of reproducibility of research.
The Center for Open Science was founded by Nosek and Jeffrey Spies to increase openness, integrity, and reproducibility of scientific research. The Center for Open Science will administer the funds as part of their mission to incentivize the replication of important scientific studies.
Core funding for the Center for Open Science and its replication initiatives comes from the Laura and John Arnold Foundation, which funds projects to promote transformational change. Stuart Buck, Director of Research at the Laura and John Arnold Foundation, stated that the Reproducibility Initiative “may eventually serve as a model for other funding agencies and patient groups, with the ultimate goal of improving cancer treatment through more rigorous and reliable science.”
“This project is key to solving an issue that has plagued scientific research for years,” said Dr. William Gunn, co-director of the Reproducibility Initiative. “The funding is a game-changer in our mission to improve scientific reproducibility.”
You can join the Reproducibility Initiative by emailing team@reproducibilityinitiative.org
Studies can be submitted for independent validation by registering them at https://www.scienceexchange.com/validation