Support for Open Science Practices in Emotion Science: A Survey Study

2020 ◽  
Author(s):  
Olmo Van den Akker ◽  
Laura Danielle Scherer ◽  
Jelte M. Wicherts ◽  
Sander Koole

So-called “open science practices” seek to improve research transparency and methodological rigor. What do emotion researchers think about these practices? To address this question, we surveyed active emotion researchers (N = 144) in October 2019 about their attitudes toward several open science practices. Overall, the majority of emotion researchers had positive attitudes toward open science practices and expressed a willingness to engage in them. Emotion researchers on average believed that replicability would improve by publishing more negative findings, by requiring open data and materials, and by conducting studies with larger sample sizes. Direct replications, multi-lab studies, and preregistration were all seen as beneficial to the replicability of emotion research. Emotion researchers believed that more direct replications would be conducted if replication studies received more funding, garnered more citations, and were easier to publish in high-impact journals. Emotion researchers believed that preregistration would be stimulated by providing researchers with more information about its benefits and more guidance on its effective application. Overall, these findings point to considerable momentum with regard to open science among emotion researchers. This momentum may be leveraged to achieve a more robust emotion science.

2021 ◽  
Author(s):  
Eric R. Louderback ◽  
Sally M Gainsbury ◽  
Robert Heirene ◽  
Karen Amichia ◽  
Alessandra Grossman ◽  
...  

The replication crisis has stimulated researchers around the world to adopt open science research practices intended to reduce publication bias and improve research quality. Open science practices include study pre-registration, open data, open publication, and avoiding methods that can lead to publication bias and low replication rates. Although gambling studies uses research methods similar to those of behavioral research fields that have struggled with replication, we know little about the uptake of open science research practices in gambling-focused research. We conducted a scoping review of 500 recent (1/1/2016 – 12/1/2019) studies focused on gambling and problem gambling to examine the use of open science and transparent research practices. Our results showed that although 54.6% (95% CI: [50.2, 58.9]) of studies used at least one of nine open science practices, each individual practice was rare: 1.6% for pre-registration (95% CI: [0.8, 3.1]), 3.2% for open data (95% CI: [2.0, 5.1]), 0% for open notebook, 35.2% for open access (95% CI: [31.1, 39.5]), 7.8% for open materials (95% CI: [5.8, 10.5]), 1.4% for open code (95% CI: [0.7, 2.9]), and 15.0% for preprint posting (95% CI: [12.1, 18.4]). In all, 6.4% (95% CI: [4.6, 8.9]) of studies reported a power analysis, and 2.4% (95% CI: [1.4, 4.2]) were replication studies. Exploratory analyses showed that studies that used any open science practice, and open access in particular, had higher citation counts. We suggest several practical ways to enhance the uptake of open science principles and practices both within gambling studies and in science more broadly.
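The abstract does not state how these confidence intervals were computed, but the reported bounds are consistent with Wilson score intervals for a proportion. A minimal Python sketch, assuming n = 500 and the 1.6% pre-registration figure (i.e., 8 studies), reproduces the reported [0.8, 3.1] interval; whether Wilson intervals were actually used is an assumption here.

```python
# Check of one reported interval, assuming Wilson score intervals were used:
# 8 of 500 studies (1.6%) were pre-registered.
from statsmodels.stats.proportion import proportion_confint

lo, hi = proportion_confint(count=8, nobs=500, alpha=0.05, method="wilson")
print(f"95% CI: [{lo * 100:.1f}, {hi * 100:.1f}]")  # -> 95% CI: [0.8, 3.1]
```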


2020 ◽  
Vol 36 (3) ◽  
pp. 263-279
Author(s):  
Isabel Steinhardt

Openness in science and education is increasing in importance within the digital knowledge society. So far, little attention has been paid to teaching Open Science in bachelor’s degree programs or in qualitative methods courses. The aim of this article is therefore to use a seminar example to explore which Open Science practices can be taught in qualitative research and how digital tools can be involved. The seminar focused on the following practices: open data practices; the use of the free and open-source tool “Collaborative Online Interpretation”; and the practice of participating, cooperating, collaborating, and contributing through participatory technologies and social networks. To learn Open Science practices, the students were involved in a qualitative research project on the “Use of digital technologies for the study and habitus of students”. The study shows that Open Data practices are easy to teach, whereas the use of free and open-source tools and of participatory technologies for collaboration, participation, cooperation, and contribution is more difficult. In addition, a cultural shift would have to take place within German universities to promote Open Science practices in general.


2021 ◽  
Author(s):  
Tamara Kalandadze ◽  
Sara Ann Hart

The increasing adoption of open science practices over the last decade has been changing the scientific landscape across fields. However, developmental science has been relatively slow to adopt open science practices. To address this issue, we followed the format of Crüwell et al. (2019) and created summaries and an annotated list of informative and actionable resources covering ten topics in developmental science: Open science; Reproducibility and replication; Open data, materials and code; Open access; Preregistration; Registered reports; Replication; Incentives; Collaborative developmental science. This article offers researchers and students in developmental science a starting point for understanding how open science intersects with developmental science. After becoming familiar with this article, the developmental scientist should understand the core tenets of open and reproducible developmental science and feel motivated to start applying open science practices in their workflow.


2019 ◽  
Author(s):  
Olivia J Kirtley ◽  
Ginette Lafit ◽  
Robin Achterhof ◽  
Anu Pauliina Hiekkaranta ◽  
Inez Myin-Germeys

A growing interest in understanding complex and dynamic psychological processes as they occur in everyday life has led to an increase in studies using Ambulatory Assessment techniques, including the Experience Sampling Method (ESM) and Ecological Momentary Assessment (EMA). There are, however, numerous “forking paths” and researcher degrees of freedom, even beyond those typically encountered with other research methodologies. Whilst a number of researchers working with ESM techniques are actively engaged in efforts to increase the methodological rigor and transparency of such research, there is currently little routine implementation of open science practices in ESM research. In the current paper, we discuss the ways in which ESM research is especially vulnerable to threats to transparency, reproducibility, and replicability. We propose that greater use of (pre-)registration, a cornerstone of open science, may address some of these threats to the transparency of ESM research. (Pre-)registration of ESM research is not without challenges, including model selection, accounting for potential model convergence issues, and the use of pre-existing datasets. As these may prove to be significant barriers to (pre-)registration for ESM researchers, we also discuss ways of overcoming these challenges and of documenting them in a (pre-)registration. A further challenge is that current general-purpose templates do not adequately capture the unique features of ESM. Here we present a (pre-)registration template for ESM research, adapted from the original Pre-Registration Challenge (Mellor et al., 2019) and the pre-registration of pre-existing data (van den Akker et al., 2020) templates, and provide examples of how to complete it.


2021 ◽  
Author(s):  
Elisabeth Weir ◽  
Danielle Reed ◽  
M. Yanina Pepino ◽  
Maria Veldhuizen ◽  
John Hayes

In March 2020, the Global Consortium for Chemosensory Research (GCCR) was founded by chemosensory researchers to address then-emerging reports of unusual smell and taste dysfunction arising from the SARS-CoV-2 pandemic. Over the next year, the GCCR used a highly collaborative model, along with contemporary Open Science practices, to produce multiple high-impact publications on chemosensation and COVID-19. This invited manuscript describes the founding of the GCCR, the tools and approaches it used, and a summary of findings to date. These findings are contextualized within a summary of some of the broader insights about chemosensation (smell, taste, and chemesthesis) and COVID-19 gained over the last 18 months, including potential mechanisms of loss. The manuscript also includes a detailed discussion of some current Open Science approaches and practices used by the GCCR to increase transparency, rigor, and reproducibility.


2020 ◽  
Author(s):  
Denis Cousineau

Born-Open Data experiments are encouraged as part of better open science practices. To be adopted, Born-Open Data practices must be easy to implement. Herein, I introduce a package for E-Prime such that data files are automatically saved to a GitHub repository. The BornOpenData package for E-Prime works seamlessly, performing the upload as soon as the experiment is finished, so there are no additional steps to perform beyond placing a package call within E-Prime. Because E-Prime files are not standard tab-separated files, I also provide an R function that retrieves the data directly from GitHub into a data frame ready to be analyzed. At this time, there are no standards as to what should constitute an adequate open-access data repository, so I propose a few suggestions that any future Born-Open Data system could follow for easier use by the research community.
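The package’s R retrieval function is not reproduced here; the following is a minimal Python sketch of the same born-open-data idea, fetching a data file from a public GitHub repository into a data frame. The repository path and file name are hypothetical, and the sketch assumes a plain tab-separated file, whereas real E-Prime output requires the extra parsing the package’s R function provides.

```python
# Illustrative sketch only (not the BornOpenData package's API): read a
# born-open data file straight from a public GitHub repository.
import pandas as pd

# Hypothetical repository and file name, for illustration.
raw_url = ("https://raw.githubusercontent.com/"
           "some-lab/born-open-data/main/subject-001.txt")

# Assumes a plain tab-separated file; actual E-Prime output is not
# standard tab-separated and needs the parsing the R function supplies.
df = pd.read_csv(raw_url, sep="\t")
print(df.head())
```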


2022 ◽  
Author(s):  
Bermond Scoggins ◽  
Matthew Peter Robertson

The scientific method is predicated on transparency -- yet the pace at which transparent research practices are being adopted by the scientific community is slow. The replication crisis in psychology showed that published findings employing statistical inference are threatened by undetected errors, data manipulation, and data falsification. To mitigate these problems and bolster research credibility, open data and preregistration have increasingly been adopted in the natural and social sciences. While many political science and international relations journals have committed to implementing these reforms, the extent of open science practices is unknown. We bring large-scale text analysis and machine learning classifiers to bear on the question. Using population-level data -- 93,931 articles across the top 160 political science and IR journals between 2010 and 2021 -- we find that approximately 21% of all statistical inference papers have open data, and 5% of all experiments are preregistered. Despite this shortfall, the example of leading journals in the field shows that change is feasible and can be effected quickly.
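The authors’ classifier pipeline is not described in this abstract; the sketch below only illustrates the general approach of classifying article text for open-data statements, with a toy training set, TF-IDF features, and logistic regression as stand-in choices rather than the paper’s actual design.

```python
# Toy illustration of text classification for open-data statements.
# Training examples, features, and model are stand-in assumptions,
# not the authors' pipeline.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = [
    "Replication data and code are posted on the Harvard Dataverse.",
    "All data files are available in a public online repository.",
    "Data are available from the authors upon reasonable request.",
    "No replication materials accompany this article.",
]
labels = [1, 1, 0, 0]  # 1 = open-data statement present

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(texts, labels)
print(clf.predict(["The dataset is deposited in a public repository."]))
```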


2018 ◽  
Author(s):  
Gerit Pfuhl ◽  
Jon Grahe

Recent years have seen a revolution in publishing and broad support for open access publishing. There has been a slower acceptance of, and transition to, other open science principles such as open data, open materials, and preregistration. To accelerate the transition and make open science the new standard, the Collaborative Replications and Education Project (CREP; http://osf.io/wfc6u/) was launched in 2013, hosted on the Open Science Framework (osf.io), with each individual contributor’s project collecting partial data, much like a preprint. CREP introduces open science at the start of academic research, facilitating student research training in open science and solidifying behavioral science results. The CREP team attempts to achieve this by inviting contributors to replicate one of several studies selected for scientific impact and for suitability for undergraduates to complete during one academic term. Contributors follow clear protocols, with students interacting with a CREP team that reviews the materials and a video of the procedure to ensure quality data collection while students learn scientific practices and methods. By combining multiple replications from undergraduates across the globe, the findings can be pooled for meta-analysis and so contribute to generalizable and replicable research findings. CREP is careful not to interpret any single result. CREP has recently joined forces with the Psychological Science Accelerator (PsySciAcc), a globally distributed network of psychological laboratories accelerating the accumulation of reliable and generalizable results in the behavioral sciences. The Department of Psychology at UiT is part of the network and has two ongoing CREP studies, maintaining open science practices from early on. In this talk, we will present our experiences of conducting transparent, replicable research and our experience with preprints, from supervisor and researcher perspectives.
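CREP’s actual meta-analytic procedure is not detailed in this abstract; the following is only a minimal sketch of how effect sizes from multiple student replications could be pooled with inverse-variance (fixed-effect) weighting, using made-up numbers.

```python
# Minimal sketch of inverse-variance (fixed-effect) pooling across
# replication sites. Effect sizes and variances are made-up placeholders,
# not CREP data; CREP's actual procedure may differ.
import numpy as np

effects = np.array([0.31, 0.18, 0.25, 0.40])       # per-site effect sizes
variances = np.array([0.020, 0.030, 0.015, 0.050])  # per-site sampling variances

weights = 1.0 / variances
pooled = np.sum(weights * effects) / np.sum(weights)
se = np.sqrt(1.0 / np.sum(weights))
print(f"pooled effect = {pooled:.3f}, 95% CI half-width = {1.96 * se:.3f}")
```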


2021 ◽  
Author(s):  
Robert Heirene ◽  
Debi LaPlante ◽  
Eric R. Louderback ◽  
Brittany Keen ◽  
Marjan Bakker ◽  
...  

Study preregistration is one of several “open science” practices (e.g., open data, preprints) that researchers use to improve the transparency and rigour of their research. As more researchers adopt preregistration as a regular research practice, examining the nature and content of preregistrations can help identify strengths and weaknesses of current practices. The value of preregistration, in part, relates to the specificity of the study plan and the extent to which investigators adhere to this plan. We identified 53 preregistrations from the gambling studies field meeting our predefined eligibility criteria and scored their level of specificity using a 23-item protocol developed to measure the extent to which a clear and exhaustive preregistration plan restricts various researcher degrees of freedom (RDoF; i.e., the many methodological choices available to researchers when collecting and analysing data, and when reporting their findings). We also scored studies on a 32-item protocol that measured adherence to the preregistered plan in the study manuscript. We found that gambling preregistrations had low specificity levels on most RDoF. However, a comparison with a sample of cross-disciplinary preregistrations (N = 52; Bakker et al., 2020) indicated that gambling preregistrations scored higher on 12 (of 29) items. Thirteen (65%) of the 20 associated published articles or preprints deviated from the protocol without declaring as much (the mean number of undeclared deviations per article was 2.25, SD = 2.34). Overall, while we found improvements in specificity and adherence over time (2017-2020), our findings suggest the purported benefits of preregistration—including increasing transparency and reducing RDoF—are not fully achieved by current practices. Using our findings, we provide 10 practical recommendations that can be used to support and refine preregistration practices.


2021 ◽  
Vol 35 (3) ◽  
pp. 193-214
Author(s):  
Edward Miguel

A decade ago, the term “research transparency” was not on economists' radar screen, but in a few short years a scholarly movement has emerged to bring new open science practices, tools, and norms into the mainstream of our discipline. The goal of this article is to lay out the evidence on the adoption of these approaches – in three specific areas: open data, pre-registration and pre-analysis plans, and journal policies – and, more tentatively, to begin to assess their impacts on the quality and credibility of economics research. The evidence to date indicates that economics (and related quantitative social science fields) is in a period of rapid transition toward new transparency-enhancing norms. While solid data on the benefits of these practices in economics are still limited, in part due to their relatively recent adoption, there is growing reason to believe that critics' worst fears regarding onerous adoption costs have not been realized. Finally, the article presents a set of frontier questions and potential innovations.

