Analysis of the Innovation Research Boot Camp

Last registered on April 28, 2022

Pre-Trial

Trial Information

General Information

Title
Analysis of the Innovation Research Boot Camp
RCT ID
AEARCTR-0009292
Initial registration date
April 21, 2022

The initial registration date is when the registration was submitted to the Registry to be reviewed for publication.

First published
April 28, 2022, 5:55 PM EDT

First published corresponds to when the trial was first made public on the Registry after being reviewed.

Locations

Region

Primary Investigator

Affiliation
Dartmouth College

Other Primary Investigator(s)

PI Affiliation
Northwestern University

Additional Trial Information

Status
Ongoing
Start date
2022-03-12
End date
2023-06-24
Secondary IDs
Prior work
This trial does not extend or rely on any prior RCTs.
Abstract
The Innovation Research Boot Camp (IRBC) aims to encourage high-quality innovation research by reducing entry barriers to the field. The IRBC will introduce more young scholars to the cutting edge of research and equip them with leading research methods. The IRBC further seeks to overcome institutional and community barriers that limit collaborative opportunities, mentoring opportunities, the dissemination of ideas, and the diversity of thought in the field.

Measuring the impact of the IRBC will help us understand the value-added of this program both on average and across different groups of students. For example, if the program appears to have larger benefits for individuals from less elite institutions or from underrepresented groups, such findings could be used to target admission decisions in future years.

Among the students who apply and meet the program requirements, randomizing which applicants we admit versus reject provides the most transparent way to estimate the impact of the IRBC. This process also helps avoid conflicts of interest and implicit bias in admissions decisions.
External Link(s)

Registration Citation

Citation
Jones, Ben and Heidi Williams. 2022. "Analysis of the Innovation Research Boot Camp." AEA RCT Registry. April 28. https://doi.org/10.1257/rct.9292-1.0
Sponsors & Partners

There is information in this trial unavailable to the public.
Experimental Details

Interventions

Intervention(s)
Applicants will be recruited through the same channels we would otherwise use for the Innovation Research Boot Camp.
Intervention Start Date
2022-03-12
Intervention End Date
2022-07-22

Primary Outcomes

Primary Outcomes (end points)
Direction of research. Does participation in the boot camp shift research topics toward innovation? We will study this outcome using Ph.D. theses (titles/abstracts/keywords), working papers, and published papers.

Collaborations. Does participation in the boot camp lead to coauthorships with other boot camp participants and/or teaching faculty and/or Summer Institute program participants?

Networking. Does participation in the boot camp lead to more submissions to, and attendance at, NBER conferences? Does participation change opportunities for advice and extend the “invisible college,” as measured through the acknowledgments in a given paper?

Job placement. Does participation change the probability of an academic job? Does it improve the quality of academic job placement?
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
The randomization will occur among the applicants to the program, stratified based on application characteristics. An illustrative sketch of the assignment procedure appears at the end of this section.
Experimental Design Details
Randomization Method
Randomization done in office by a computer
Randomization Unit
Applicant
Was the treatment clustered?
No
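
As an illustration only, the following minimal sketch shows one way a computer-based, stratified random assignment of individual applicants could be implemented, assuming an even 50/50 split within each stratum (consistent with the sample-size fields below). The column names, strata labels, and seed are hypothetical; this is not the registered randomization code.

```python
# Illustrative sketch of stratified random assignment among applicants.
# The column names ("stratum", "applicant_id") and the seed are hypothetical;
# the registration describes the method only as computer-based randomization,
# stratified on application characteristics, with evenly split arms.
import random

import pandas as pd


def assign_treatment(applicants: pd.DataFrame, seed: int = 2022) -> pd.DataFrame:
    """Randomly assign half of each stratum to treatment (admit) vs. control."""
    rng = random.Random(seed)
    out = []
    for stratum, group in applicants.groupby("stratum"):
        ids = list(group["applicant_id"])
        rng.shuffle(ids)
        n_treat = len(ids) // 2  # even split; odd strata leave the extra applicant in control
        treated = set(ids[:n_treat])
        out.append(group.assign(treated=group["applicant_id"].isin(treated)))
    return pd.concat(out, ignore_index=True)


if __name__ == "__main__":
    # Toy data: 8 applicants across 2 hypothetical strata.
    df = pd.DataFrame({
        "applicant_id": range(8),
        "stratum": ["A", "A", "A", "A", "B", "B", "B", "B"],
    })
    print(assign_treatment(df))
```

How ties within odd-sized strata are broken is a detail the actual protocol would specify; the sketch simply places the extra applicant in the control group.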

Experiment Characteristics

Sample size: planned number of clusters
N/A
Sample size: planned number of observations
Number of applicants
Sample size (or number of clusters) by treatment arms
Evenly split
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
Perhaps the closest analog to this study is the randomized controlled trial used to evaluate the CeMENT mentoring program, a two-day program established by CSWEP with support from the NSF and the AEA. That program received roughly 80 applications per year. Applicants were divided into approximately 8-10 research fields (e.g., health economics or development) with around 8 applicants each, and within field groups applicants were randomly assigned to participate or not participate in the CeMENT program. Treated applicants were found to accrue more publications and more grant funding and to be more likely to hold a tenured or tenure-track position (Blau et al. 2010; Ginther et al. 2020; Ginther and Na 2021). While not exactly analogous to the IRBC, this work suggests that, pooling across several cohorts, we should have statistical power to detect meaningful effects on similar outcomes.
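
As a rough, illustrative complement to the CeMENT analogy above, the sketch below computes the minimum detectable effect in standard-deviation units for a two-arm, evenly split comparison using the standard normal-approximation formula. The cohort size (about 80 applicants), the number of cohorts pooled, the 5% significance level, and 80% power are assumptions chosen for illustration, not registered parameters.

```python
# Back-of-the-envelope MDE for a two-arm, evenly split comparison.
# All numbers (applicants per cohort, cohorts pooled, alpha, power) are
# illustrative assumptions following the CeMENT analogy, not registered values.
from scipy.stats import norm


def mde_standardized(n_treat: int, n_control: int,
                     alpha: float = 0.05, power: float = 0.80) -> float:
    """Minimum detectable effect in standard-deviation units (two-sided test)."""
    z_alpha = norm.ppf(1 - alpha / 2)
    z_power = norm.ppf(power)
    return (z_alpha + z_power) * (1 / n_treat + 1 / n_control) ** 0.5


if __name__ == "__main__":
    applicants_per_cohort = 80  # hypothetical, per the CeMENT analogy
    for cohorts in (1, 2, 3):
        n = cohorts * applicants_per_cohort // 2  # evenly split arms
        print(f"{cohorts} cohort(s): MDE ≈ {mde_standardized(n, n):.2f} SD")
```

Under these illustrative assumptions, the detectable standardized effect falls from roughly 0.6 SD with a single cohort to roughly 0.36 SD when three cohorts are pooled, consistent with the pooling argument above.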
IRB

Institutional Review Boards (IRBs)

IRB Name
National Bureau of Economic Research
IRB Approval Date
2022-01-20
IRB Approval Number
IRB Ref#22-001

Post-Trial

Post Trial Information

Study Withdrawal

There is information in this trial unavailable to the public.

Intervention

Is the intervention completed?
No
Data Collection Complete
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials