Competition and Quality in Science

Last registered on May 17, 2023

Pre-Trial

Trial Information

General Information

Title
Competition and Quality in Science
RCT ID
AEARCTR-0011356
Initial registration date
May 09, 2023


First published
May 17, 2023, 2:12 PM EDT


Locations

Region

Primary Investigator

Affiliation
UC Berkeley

Other Primary Investigator(s)

PI Affiliation
Northwestern University

Additional Trial Information

Status
In development
Start date
2023-05-10
End date
2023-05-24
Secondary IDs
Prior work
This trial does not extend or rely on any prior RCTs.
Abstract
This is a survey experiment to assess whether competition affects the quality of scientific research. We will reach out to corresponding authors on scientific papers via a fully anonymous email survey.

We will cross-randomize across two sets of questions. The first set of questions will assess whether scientists think that important work is more likely to be competitive. We will pose a hypothetical scenario about working on a project, but randomly vary the expected journal placement of the project. We will then ask scientists how likely they think it is that another team is working on a similar project.

The second set of questions will assess whether scientists work faster and less carefully if they believe the project is competitive (i.e., multiple parties are working on the same project). We will pose a hypothetical scenario about working on a project, but randomly vary the probability that another team is working on the project. We will then ask scientists (a) how long they would spend finishing the project and (b) which tasks they would do prior to publication (replication, additional analyses, etc.).
External Link(s)

Registration Citation

Citation
Hill, Ryan and Carolyn Stein. 2023. "Competition and Quality in Science." AEA RCT Registry. May 17. https://doi.org/10.1257/rct.11356-1.0
Experimental Details

Interventions

Intervention(s)
The intervention is an optional email survey conducted via Qualtrics. No PII will be collected. See the experimental design for more details.
Intervention (Hidden)
Intervention Start Date
2023-05-10
Intervention End Date
2023-05-24

Primary Outcomes

Primary Outcomes (end points)
Survey responses. See experimental design for more details.
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
STUDY POPULATION:

We have selected 11 scientific fields, using field classifications from Microsoft Academic Graph (MAG). MAG assigns publications to fields; we assign each researcher to the modal field of their publications. We used two criteria when selecting fields: (1) the field should be large enough to yield a sufficient sample, and (2) the field should be largely experimental. We proxy for "experimental" by flagging whether a paper's title or abstract contains the word "experiment." The selected fields are the following:

Structural biology
Cell biology
Ecology
Horticulture & agronomy
Immunology
Biochemistry
Inorganic chemistry
Condensed matter physics
Optics
Social psychology
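The sample-construction rules above (modal field per researcher, "experiment" keyword as an experimental proxy) can be sketched as follows. This is an illustrative sketch, not the study's actual code; the function names are hypothetical.

```python
from collections import Counter

def modal_field(publication_fields):
    # Assign a researcher to the field appearing most often among
    # their publications (ties broken by first occurrence).
    counts = Counter(publication_fields)
    field, _ = counts.most_common(1)[0]
    return field

def looks_experimental(title, abstract):
    # Proxy for experimental work: flag whether the word
    # "experiment" appears in the title or abstract (case-insensitive).
    text = f"{title} {abstract}".lower()
    return "experiment" in text
```

For example, a researcher with papers classified as Ecology, Optics, and Ecology would be assigned to Ecology.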


SURVEY EXPERIMENT:

Every recipient will be sent an email survey via Qualtrics. Two of the question modules will be cross-randomized.

Module 1. Randomized text is in the square brackets.

Consider the following scenario: You are working on a project and you have generated some preliminary results. Based on the research question and your results, you expect that it will be published in a [high impact journal (such as Science, Nature, or the top journal in your field) / medium impact field journal].

Q1. How likely is it that another research team is working on a very similar project? (0 to 100% slider)

Module 2. Randomized text is in the square brackets.

Consider a different scenario: Suppose you have generated some preliminary results for a project. You are fairly confident that [nobody else is working on a very similar project (less than a 10% chance) / another team is working on a very similar project (greater than a 90% chance)]. Answer the following three questions with this scenario in mind.

Q1. How long would it take for you to complete the project and submit the paper to a journal? (sliding scale 0 to 24 months)

Q2. Which of the following would you do prior to submitting the paper?
Re-run or replicate key experiments (Yes/Maybe/No/NA)
Run additional supporting experiments (Yes/Maybe/No/NA)
Perform a thorough code review (Yes/Maybe/No/NA)
Perform a thorough review of any mathematical or analytical analyses (Yes/Maybe/No/NA)
Perform a thorough proofreading of the manuscript (Yes/Maybe/No/NA)

Q3. Would you do anything else prior to submission not listed above? (free-form text; optional)

Module 3. No randomization

Q1. In general, how would you rate the competition to publish first in your field? (1-4 scale)

Q2. In general, do you feel that peers in your field ever sacrifice the quality of their research in order to publish first? (1-4 scale)

Q3. When you are reading published research in your field, how do you assess the quality of the work? Is there a systematic way to measure high quality, careful research in your field? (free form text; optional)

Experimental Design Details
Randomization Method
Randomization done on a computer via Stata code.
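The registration states the randomization is done in Stata. As an illustrative sketch only (in Python, with hypothetical arm labels and seed), the individual-level cross-randomization of the two modules could look like:

```python
import random

def assign_arms(respondent_ids, seed=12345):
    # seed and arm labels are illustrative, not from the study code
    rng = random.Random(seed)
    assignments = {}
    for rid in respondent_ids:
        # the two modules are randomized independently (cross-randomization)
        assignments[rid] = {
            "module1": rng.choice(["high_impact", "medium_impact"]),
            "module2": rng.choice(["no_competitor", "competitor"]),
        }
    return assignments
```

Because each module's arm is drawn independently per individual, every combination of Module 1 and Module 2 conditions occurs with equal probability in expectation.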
Randomization Unit
Individual
Was the treatment clustered?
No

Experiment Characteristics

Sample size: planned number of clusters
No clustering. About 100,000 individuals.
Sample size: planned number of observations
We plan to survey 10,000 scientists per field. If a field has fewer than 10,000 scientists, we will survey all of them; if it has more than 10,000, we will randomly select 10,000. This leads to a total sample size of around 100,000 scientists:
Structural biology: 3,404
Agronomy: 7,893
Animal sciences: 3,179
Biophysics: 1,672
Cell biology: 10,000
Ecology: 10,000
Biochemistry: 10,000
Inorganic chemistry: 10,000
Condensed matter physics: 10,000
Optics: 10,000
Social psychology: 10,000
Sample size (or number of clusters) by treatment arms
N/A.
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
POWER CALCULATION AND MINIMUM DETECTABLE EFFECTS

Based on our piloting, we expect an 8% response rate. That translates to a full sample (across all fields) of 8,000. For structural biology (our key field of interest), this translates to a sample of 800. We calculate the MDE for the full sample and for the structural biology sample.

Module 1, Q1. Based on piloting, SD = 27 and the control-group mean is 54. Full sample: MDE = (2 * 27 * (1.96 + 0.84)) / sqrt(6,080) = 1.7 percentage points (3.1% increase/decrease). Structural biology: MDE = (2 * 27 * (1.96 + 0.84)) / sqrt(800) = 5.3 percentage points (9.8% increase/decrease).

Module 2, Q1. Based on piloting (using a 12-month scale), SD = 3.7 and the control-group mean is 8.4. Full sample: MDE = (2 * 3.7 * (1.96 + 0.84)) / sqrt(8,000) = 0.23 months (2.7% increase/decrease). Structural biology: MDE = (2 * 3.7 * (1.96 + 0.84)) / sqrt(800) = 0.73 months (8.7% increase/decrease).

Module 2, Q2. Based on piloting, our preferred outcome (whether respondents would run additional experiments) has SD = 0.67 and a control-group mean of 1.6. Full sample: MDE = (2 * 0.67 * (1.96 + 0.84)) / sqrt(8,000) = 0.042 points (2.6% increase/decrease). Structural biology: MDE = (2 * 0.67 * (1.96 + 0.84)) / sqrt(800) = 0.13 points (8.1% increase/decrease).
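The MDE formula used in these calculations, MDE = 2 * SD * (z_alpha + z_beta) / sqrt(N) with z values 1.96 (5% two-sided test) and 0.84 (80% power), can be checked with a short script. The function name is illustrative.

```python
import math

def mde(sd, n, z_alpha=1.96, z_beta=0.84):
    # Minimum detectable effect for an equal-split two-arm comparison:
    # MDE = 2 * sd * (z_alpha + z_beta) / sqrt(n)
    return 2 * sd * (z_alpha + z_beta) / math.sqrt(n)
```

For instance, mde(3.7, 8000) reproduces the 0.23-month full-sample MDE for the project-completion question, and mde(27, 800) the 5.3-percentage-point MDE for structural biology.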
IRB

Institutional Review Boards (IRBs)

IRB Name
IRB Approval Date
IRB Approval Number

Post-Trial

Post Trial Information

Study Withdrawal


Intervention

Is the intervention completed?
No
Data Collection Complete
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials