Recently UC Libraries and the Graduate School hosted the Center for Open Science for two workshops on research reproducibility. The Center for Open Science (COS), a not-for-profit based in Charlottesville, Va., promotes openness, integrity and transparency in research. Ian Sullivan of the COS facilitated the workshop and worked with researchers to address several types of reproducibility issues in research: computational reproducibility, methodological reproducibility and results replicability.
Computational reproducibility means that, given the data and the code or analysis methods used, someone else could reproduce the graphs and calculations in your paper or report. Methodological reproducibility means that someone else could follow your protocols, rerun the experiment and get the same results you did. And results replicability means that, with new data and using your methods and analysis, someone else can come to the same conclusion you did.
The workshop introduced participants to the Open Science Framework (OSF), a project management tool. The OSF is freely available and helps researchers document their research process and materials. Reproducibility relies on having access to the research process, the data and the analysis scripts that underlie a research project. Setting up a well-organized, well-documented project in the OSF is a great way to achieve reproducibility in research.
At UC, we have set up our own portal to the Open Science Framework so UC community members can easily establish an account and log in using single sign-on authentication. This portal is available at https://osf.uc.edu.
The Center for Open Science is active in many projects in support of research reproducibility. Several large projects are housed within the OSF, such as OSF for Meetings, OSF for Registries and OSF for Preprints. A registration is a public, immutable record of your research plan in a repository. A preprint is the version of your article before it goes through peer review. Making preprints and research project registrations available brings more researchers into the peer review process.

In addition to developing the OSF, the Center for Open Science provides statistical consulting for researchers and is working with the Association of Research Libraries to develop the SHARE project. The goal of SHARE is to create a comprehensive, searchable dataset of metadata about research and scholarly activities and resources across their life cycle.

The Center for Open Science also promotes and provides training on the TOP guidelines. TOP stands for Transparency and Openness Promotion, and the guidelines spell out standards for data citation; data, materials and code transparency; design and analysis; preregistration; and replication. Journals, publishers and funders are encouraged to adopt the TOP guidelines to direct researchers on how to implement these best practices and what information and documentation to include in their articles to best share their research data and process. Sharing data, along with the process used to generate it, increases the growth rate of scientific knowledge, accelerates the self-correcting process of science and leads to more reproducible research. I recommend reading a recent editorial by Dr. Theresa Culley (UC Department of Biological Sciences) entitled "The Frontier of Data Discoverability: Why We Need to Share Our Data," which makes an excellent case for sharing reproducible data. Dr. Culley is Editor-in-Chief of the journal Applications in Plant Sciences (APPS), which has adopted the TOP guidelines.