Our lab is committed to practicing and advocating for open science and transparency in research.

But first, what is open science?

To learn more, read the article below authored by Dr. Standen about understanding and implementing open science practices in your research.

This article was originally published as a Psychological Science Agenda newsletter article for the American Psychological Association in November 2019. APA has since discontinued the newsletter, so we present the article text here for continued use.

FROM THE SCIENCE STUDENT COUNCIL

Open science, pre-registration, and striving for better research practices

Practical recommendations for getting started.

By Erin C. Standen

November 2019

What is open science, and why do we need it?

“Open science” is a broad term that encompasses a variety of efforts to make science more reproducible, accessible, transparent and rigorous. These principles have always been crucial to scientific advancement; however, the recent push for adopting open science practices has been fueled by discussions of a “replication crisis,” or as some have called it, the “credibility revolution” (Pashler & Wagenmakers, 2012; Vazire, 2018).

Both reproducibility and replicability are emphasized in discussions of open science, and the difference between them is important to understand. A finding is "reproducible" if another researcher could produce the same results as the original researcher when using the same exact data, computational steps, code and conditions of analysis (National Academies of Sciences, Engineering and Medicine, 2019). To achieve reproducibility, researchers must thoroughly report their computational steps and methods. A finding is "replicable" if separate studies testing the same scientific question have consistent results. Replicability depends on the existence of the phenomenon in question, so it is not necessarily under the researcher's control.
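
To make the computational side of reproducibility concrete, here is a minimal sketch of an analysis script that fixes its random seed and reports the interpreter version, so that the same data, code and conditions yield the same results. The data and variable names are hypothetical, not from the original article.

```python
# Minimal sketch of a reproducible analysis script (hypothetical data
# and variable names; illustrates the idea, not a required toolchain).
import sys
import random

random.seed(42)  # fix the seed so any resampling step is repeatable

# Report the computational environment alongside the results, so another
# researcher can rerun the same steps under the same conditions.
print(f"Python version: {sys.version}")

# Hypothetical analysis: mean of one bootstrap resample of scores.
scores = [3.1, 2.8, 3.5, 3.0, 2.9]  # stand-in data
resample = [random.choice(scores) for _ in scores]
print(f"Bootstrap mean: {sum(resample) / len(resample):.3f}")
```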

Many factors have contributed to the lack of replicability and reproducibility in science, some of which lie within individual researchers' control (e.g., incomplete reporting, p-hacking), and others that exist at the systemic level (e.g., publication bias). Advocates of open science practices have identified and developed various strategies to mitigate these threats to reproducibility and to encourage the responsible conduct of research (Munafò et al., 2017). In particular, prominent voices in the field have called for researchers to engage in practices that bolster the transparency of their work (Munafò et al., 2017; Nosek et al., 2019).

Increasing the transparency of psychological science has several clear benefits. First, thorough reporting of methodology allows researchers to better assess the credibility and rigor of published work. Second, providing detailed information about a study helps to make it reproducible, which is critical to future replication efforts. Finally, the expectation of transparency encourages researchers to hold themselves and their peers accountable for conducting well-designed, adequately powered and appropriately analyzed studies, which will ultimately improve the overall quality of our field's evidence. Thus, open science is important both for individual researchers seeking to maximize their scientific contributions and for the field as it advances trustworthy and replicable discoveries.

What sort of research practices are involved?

The lively discourse on open science has produced a wide array of recommended practices, including pre-registration, open access and registered reports. Each of these brings our field closer to the ideals of open science, but there are some important distinctions to keep in mind.

  • Pre-registration: Publicly reporting your study hypotheses as well as all conditions, outcome measures, covariates and planned analyses.

  • Open data: Publicly posting your cleaned (or raw) study data along with a codebook that explains how different variables are scored or coded in the dataset. It is critical that these data have been stripped of any personally identifying information before being made public. Moreover, if you plan to make your data open, you must state in your IRB materials and in the consent form that participants' data will be shared in this way. (A brief sketch of preparing a dataset and codebook for sharing follows this list.)

  • Open materials: Publicly posting your study materials, such as your consent form, research assistant scripts, measures, surveys or questionnaires.

  • Open code: Publicly posting the code used for your study analyses. This is especially useful if you are also publicly posting your dataset and codebook; however, even in the absence of open data, open code may be useful for other researchers who want to replicate your analytic strategies.

  • Open access: The free, online availability of a research product, along with the rights to use that research product with acknowledgement. Typically, open access refers to whether peer-reviewed journal publications or preprints are freely available to the public.

  • Registered reports: An emerging publication format in which a journal accepts an article, in principle, on the basis of the pre-registered study design and analysis plan, before any data are collected. The idea is for journals to prioritize well-theorized and well-designed studies, whether or not the hypotheses are ultimately supported by the data.

  • Replication research: Research that attempts to answer the same or similar questions to ones previously addressed by another study, either via direct replication (using the exact same methodology) or via conceptual replication (testing the same hypotheses or theoretical ideas, but using a different population, manipulation or measurement tool).
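
As a concrete illustration of the open data and open code practices above, here is a minimal sketch of preparing a de-identified dataset and a plain-text codebook for public posting. The column names, file names and the use of the pandas library are illustrative choices, not requirements of any of these practices.

```python
# Sketch of preparing an open dataset plus a codebook (hypothetical
# column and file names; any tabular toolkit would do).
import pandas as pd

df = pd.DataFrame({
    "participant_name": ["A. Smith", "B. Jones"],  # identifying; never share
    "condition": [1, 2],
    "stress_score": [14, 21],
})

# Strip personally identifying information before posting publicly,
# and substitute arbitrary participant IDs.
shared = df.drop(columns=["participant_name"])
shared.insert(0, "participant_id", range(1, len(shared) + 1))
shared.to_csv("study1_data_shared.csv", index=False)

# The codebook explains how each variable is scored or coded.
codebook = """participant_id: arbitrary ID assigned after de-identification
condition: 1 = control, 2 = intervention
stress_score: sum of 7 items, each rated 1-5 (range 7-35)
"""
with open("study1_codebook.txt", "w") as f:
    f.write(codebook)
```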

One common misconception is that "open access" and "open data/material/code" always exist together; however, they are not always paired, nor are they mutually exclusive. Your study materials and data could be "open" in that you have made them available online, but your actual manuscript could be published in a non-open access journal (or vice versa). For more detailed information on any of these practices, see Crüwell et al.'s (2018) introduction to open science.

Do I need to incorporate all of these practices to be promoting open science?

Not necessarily. For researchers new to open science, an appropriate starting point may be to simply pre-register a study, or make your materials, data or code publicly available. Moreover, there may be instances where certain practices are not appropriate or feasible (e.g., when working with sensitive or identifiable data). The goal is to strive for better research practices when possible.

Ready to get started? Tips for pre-registering your next study.

First off, you will need to decide where you want to pre-register your study. A popular option is the Open Science Framework (OSF), which allows you to create projects with nested pages for additional materials, updates or notes. Each nested page on OSF can be made public or kept private, so the interface can serve both as a host for your pre-registration (and other open code, data or materials) and as a private lab notebook shared among collaborators. AsPredicted is another good pre-registration site, though somewhat less flexible than OSF: users pre-register projects by listing their co-authors and answering a series of nine questions about their hypotheses, study design and planned analyses.
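
As a side note for those who like to script their workflow, OSF also exposes a public web API. The sketch below fetches a public project's metadata, assuming the v2 JSON endpoint described in OSF's API documentation; the project ID shown is a placeholder.

```python
# Sketch of reading a public OSF project's metadata via OSF's web API
# (node_id is a placeholder; endpoint per OSF's documented v2 API).
import requests

node_id = "abcde"  # hypothetical five-character OSF project ID
resp = requests.get(f"https://api.osf.io/v2/nodes/{node_id}/", timeout=30)
resp.raise_for_status()

attributes = resp.json()["data"]["attributes"]
print(attributes["title"])
print("Public:", attributes["public"])
```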

Next, you'll need to create the actual pre-registration. Similar to AsPredicted, OSF has a pre-formatted option, where you can fill in the blanks with your study information. However, many find this pre-formatted style limiting, or else want to include different information than what is called for. In this case, you may prefer to upload your own file (.docx, .pdf, .html, etc.) to OSF. There are many templates and examples available online, which can be a helpful starting point for creating your first pre-registration file.
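
If you draft your own file, even a bare skeleton helps keep the key elements from being forgotten. Here is a minimal sketch that writes one out; the section headings simply mirror the pre-registration elements discussed above and are not an official template from OSF or AsPredicted.

```python
# Sketch that writes a bare-bones pre-registration skeleton to a text
# file (headings are illustrative, not an official template).
SECTIONS = [
    "Hypotheses",
    "Conditions / study design",
    "Outcome measures",
    "Covariates",
    "Sample size and stopping rule",
    "Exclusion criteria",
    "Planned analyses",
]

with open("preregistration_draft.txt", "w") as f:
    f.write("Pre-registration: [study title]\n\n")
    for section in SECTIONS:
        f.write(f"{section}:\n[to be completed before data collection]\n\n")
```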

Once everything is uploaded, most sites give the researcher the chance to decide when the pre-registration will become publicly available. You can choose to make it immediately available, or set it to remain private ("embargo") for a predefined period (on OSF, for a maximum of four years). By setting up an embargo, you can be assured that potential participants will not be able to see your study hypotheses and also that your ideas will be protected from other researchers "scooping" them before the study is published. If you decide that you want your pre-registration to become available before the selected embargo period expires, you can cancel the embargo to make it immediately available.

What if I need to change my study design or analysis plan after I've pre-registered?

You are not limited by your pre-registration. In fact, you can upload an update to your original pre-registration at any time. You can explain any changes that need to be made to your study protocol, or even add hypotheses as new literature is published. If you need to change your analysis plan, or if you want to test your hypotheses in a different way, the key is to be transparent: any analyses that were not pre-registered should be described as "exploratory" or "non-pre-registered" when you write up or present your results.

I pre-registered my study, and now my data collection and analyses are complete. Now what?

What you do next is entirely up to you. You may choose to upload your study materials, data or analysis code to an OSF project page and provide a link to those materials in any manuscripts submitted for publication. Alternatively, you could choose not to share those materials. However, before sharing study data, you should confirm that the consent form and IRB application included language indicating to your institution and to participants that their data may be shared in this way. In addition, you should make sure that your study data are fully de-identified and that there is no way to link responses back to individual participants.
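
Because a single overlooked column can undo de-identification, a quick programmatic check can complement your manual review. The sketch below flags column names commonly treated as direct identifiers; the file name carries over from the earlier hypothetical example, and no script substitutes for careful review and your IRB's requirements.

```python
# Last-pass check for direct-identifier columns before sharing a dataset
# (column and file names are hypothetical; this complements, and does
# not replace, a careful manual review).
import pandas as pd

DIRECT_IDENTIFIERS = {"name", "email", "phone", "address", "ip_address", "dob"}

df = pd.read_csv("study1_data_shared.csv")
flagged = [c for c in df.columns if c.lower() in DIRECT_IDENTIFIERS]

if flagged:
    raise SystemExit(f"Possible identifiers still present: {flagged}")
print("No flagged identifier columns found.")
```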

You're on your way

It is important to keep in mind that best practices in the credibility revolution continue to develop. Thus, it is advisable to stay in touch with the literature and discussions on these topics. Some recommended readings are listed below. Nonetheless, the overall takeaway from open science advocates is clear: transparency is key to assessing rigor and reproducibility, both of which strengthen the future of psychological science.

References and recommended readings

Crüwell, S., van Doorn, J., Etz, A., Makel, M.C., Moshontz, H., Niebaum, J.C., ... Schulte-Mecklenbeck, M. (2018). 7 easy steps to open science: An annotated reading list. https://doi.org/10.31234/osf.io/cfzyx

Munafò, M.R., Nosek, B.A., Bishop, D.V.M., Button, K.S., Chambers, C.D., Percie du Sert, N., Simonsohn, U., Wagenmakers, E.-J., Ware, J.J., & Ioannidis, J.P.A. (2017). A manifesto for reproducible science. Nature Human Behaviour, 1, 0021. https://doi.org/10.1038/s41562-016-0021

National Academies of Sciences, Engineering, and Medicine. (2019). Reproducibility and Replicability in Science. Washington, D.C.: The National Academies Press. https://doi.org/10.17226/25303.

Nosek, B.A., Beck, E.D., Campbell, L., Flake, J.K., Hardwicke, T.E., Mellor, D.T., van 't Veer, A.E., & Vazire, S. (2019). Preregistration is hard, and worthwhile. Trends in Cognitive Sciences, 23(10), 815-818. https://doi.org/10.1016/j.tics.2019.07.009

Nosek, B.A., Ebersole, C.R., DeHaven, A.C., & Mellor, D.T. (2018). The preregistration revolution. Proceedings of the National Academy of Sciences, 115(11), 2600-2606. https://doi.org/10.1073/pnas.1708274114

Pashler, H., & Wagenmakers, E.-J. (2012). Editors' introduction to the special section on replicability in psychological science: A crisis of confidence? Perspectives on Psychological Science, 7(6), 528-530. https://doi.org/10.1177/1745691612465253

Vazire, S. (2018). Implications of the credibility revolution for productivity, creativity, and progress. Perspectives on Psychological Science, 13(4), 411-417. https://doi.org/10.1177/1745691617751884