
There must be an error here! Experimental evidence on coding errors' biases

Last registered on October 07, 2021

Pre-Trial

Trial Information

General Information

Title
There must be an error here! Experimental evidence on coding errors' biases
RCT ID
AEARCTR-0008312
Initial registration date
October 05, 2021


First published
October 07, 2021, 4:05 PM EDT


Locations

Region

Primary Investigator

Affiliation
Sao Paulo School of Economics - FGV

Other Primary Investigator(s)

PI Affiliation
Yale University

Additional Trial Information

Status
In development
Start date
2021-10-04
End date
2023-12-31
Secondary IDs
Prior work
This trial does not extend or rely on any prior RCTs.
Abstract
Empirical research generally involves data manipulation and analysis that is subject to coding errors. We investigate whether the probability of detecting coding errors depends on the nature of the results.
External Link(s)

Registration Citation

Citation
Ferman, Bruno and Lucas Finamor. 2021. "There must be an error here! Experimental evidence on coding errors' biases." AEA RCT Registry. October 07. https://doi.org/10.1257/rct.8312-1.0
Experimental Details

Interventions

Intervention(s)
Intervention Start Date
2021-10-04
Intervention End Date
2022-12-31

Primary Outcomes

Primary Outcomes (end points)
Indicator variable for whether the experimental question was answered correctly.
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
Subjects must write code to answer a set of questions. We randomize whether the dataset they receive leads them to expected or unexpected findings if they make a coding error.
Experimental Design Details
The study will take place in a recruitment process for research assistants of a Partner Institute. In this recruitment process, candidates are asked to perform a data task to evaluate their coding abilities.

Candidates in this recruitment process will receive, by email from the Partner Institute, a data task embedded in an online survey created with the Qualtrics software.

The data task includes a number of questions that candidates must answer using an appropriate statistical software package. One of the questions requires candidates to perform an activity that is subject to a common coding error. We randomize the questionnaire and the dataset each candidate receives so that, if they made this coding error, they would find either expected or unexpected results.

The Partner Institute will then send the researchers anonymized data from the candidates who agreed to have their anonymized data shared for research purposes.
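The registration does not disclose which coding error the task targets, so the following is a purely hypothetical sketch of the kind of mistake such a data task can elicit: treating a numeric missing-value sentinel (here, an assumed code of -99) as real data when computing a summary statistic. The variable names and the sentinel value are illustrative, not taken from the actual task.

```python
# Hypothetical example of a common coding error: a missing-value
# sentinel (-99) is included in a mean instead of being dropped first.
incomes = [1200, 1500, -99, 1100, -99]  # -99 denotes "missing" (assumed coding)

# Buggy computation: the sentinel values deflate the mean.
naive_mean = sum(incomes) / len(incomes)

# Correct computation: exclude sentinel values before averaging.
valid = [x for x in incomes if x != -99]
correct_mean = sum(valid) / len(valid)

print(naive_mean)    # biased estimate
print(correct_mean)  # estimate on valid observations only
```

Depending on which dataset a candidate is randomized into, an error like this could produce a result that looks plausible (and so goes unnoticed) or one that looks suspicious (prompting the candidate to hunt for the bug), which is the asymmetry the experiment is designed to detect.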
Randomization Method
Randomization is done in the Qualtrics software.
Randomization Unit
Randomization is done at the individual level.
Was the treatment clustered?
No

Experiment Characteristics

Sample size: planned number of clusters
Randomization will be done at the individual level.
Sample size: planned number of observations
We expect to have a sample of 800 subjects.
Sample size (or number of clusters) by treatment arms
400 in the treatment and 400 in the control group.
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
IRB

Institutional Review Boards (IRBs)

IRB Name
Yale Human Research Protection Program
IRB Approval Date
2021-02-01
IRB Approval Number
2000029851
IRB Name
Research Ethics Committee of Fundação Getulio Vargas
IRB Approval Date
2021-03-05
IRB Approval Number
020/2021

Post-Trial

Post Trial Information

Study Withdrawal

There is information in this trial unavailable to the public.

Intervention

Is the intervention completed?
No
Data Collection Complete
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials