Analysis of the Innovation Research Boot Camp

Last registered on April 28, 2022


Trial Information

General Information

Analysis of the Innovation Research Boot Camp
Initial registration date
April 21, 2022


First published
April 28, 2022, 5:55 PM EDT




Primary Investigator

Dartmouth College

Other Primary Investigator(s)

PI Affiliation
Northwestern University

Additional Trial Information

Ongoing
Start date
End date
Secondary IDs
Prior work
This trial does not extend or rely on any prior RCTs.
The Innovation Research Boot Camp (IRBC) aims to encourage high-quality innovation research by reducing entry barriers to the field. The IRBC will introduce more young scholars to the cutting edge of research and equip them with leading research methods. The IRBC further seeks to overcome institutional and community barriers that limit collaborative opportunities, mentoring opportunities, the dissemination of ideas, and the diversity of thought in the field.

Measuring the impact of the IRBC will help us understand the value-added of this program both on average and across different groups of students. For example, if the program appears to have larger benefits for individuals from less elite institutions or from underrepresented groups, such findings could be used to target admission decisions in future years.

Among the students who apply and meet the program requirements, randomizing which applicants we admit versus reject provides the most transparent way to estimate the impact of the IRBC. This process can also help avoid conflicts of interest and implicit bias in admissions decisions.
External Link(s)

Registration Citation

Jones, Ben and Heidi Williams. 2022. "Analysis of the Innovation Research Boot Camp." AEA RCT Registry. April 28.
Sponsors & Partners

There is information in this trial unavailable to the public.
Experimental Details


Applicants will be recruited through the same channels we would otherwise use to recruit applicants for the Innovation Research Boot Camp.
Intervention Start Date
Intervention End Date

Primary Outcomes

Primary Outcomes (end points)
Direction of research. Does participation in the boot camp shift research topics toward innovation? We will study this outcome using Ph.D. theses (titles/abstracts/keywords), working papers, and published papers.

Collaborations. Does participation in the boot camp lead to coauthorships with other boot camp participants and/or teaching faculty and/or Summer Institute program participants?

Networking. Does participation in the boot camp lead to more submissions / attendance in NBER conferences? Does participation change opportunities for advice and extend the “invisible college” as measured through the acknowledgments in a given paper?

Job placement. Does participation change the probability of an academic job? Does it improve the quality of academic job placement?
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
The randomization will occur among the applicants to the program, stratified based on application characteristics.
Experimental Design Details
Randomization Method
Randomization done in office by a computer
Randomization Unit
Was the treatment clustered?
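The design above — computer randomization among applicants, stratified on application characteristics, with an even split across arms — can be sketched as follows. This is an illustrative sketch only; the stratum key and applicant fields are hypothetical and not taken from the registration.

```python
import random

def stratified_assign(applicants, stratum_of, seed=2022):
    """Shuffle applicants within each stratum, then split each stratum
    as evenly as possible into treatment and control."""
    rng = random.Random(seed)  # fixed seed so the draw is reproducible
    strata = {}
    for a in applicants:
        strata.setdefault(stratum_of(a), []).append(a)
    assignment = {}
    for members in strata.values():
        rng.shuffle(members)
        cut = len(members) // 2  # if a stratum is odd-sized, control gets the extra
        for a in members[:cut]:
            assignment[a["id"]] = "treatment"
        for a in members[cut:]:
            assignment[a["id"]] = "control"
    return assignment

# Hypothetical applicant pool with two application-characteristic strata
pool = [{"id": i, "field": "micro" if i < 8 else "macro"} for i in range(16)]
arms = stratified_assign(pool, lambda a: a["field"])
```

Stratifying before the random draw guarantees balance on the chosen application characteristics by construction, rather than relying on a single unstratified draw to balance them by chance.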

Experiment Characteristics

Sample size: planned number of clusters
Sample size: planned number of observations
Number of applicants
Sample size (or number of clusters) by treatment arms
Evenly split
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
Perhaps the closest analog to this study is the randomized controlled trial used to evaluate the CeMENT mentoring program, a two-day program established by CSWEP with support from the NSF and the AEA. That program received roughly 80 applications per year. Applicants were divided into roughly 8-10 research fields (e.g., health economics or development) with around 8 applicants each, and within field groups applicants were randomly assigned to participate or not participate in the CeMENT program. Treated applicants were found to accrue more publications and more grant funding, and to be more likely to hold a tenured or tenure-track position (Blau et al. 2010; Ginther et al. 2020; Ginther and Na 2021). While not exactly analogous to the IRBC, this work suggests that, pooling across several cohorts, we should have statistical power to detect meaningful effects on similar outcomes.
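As a rough illustration of the power logic in that comparison, a standard normal-approximation formula gives the minimum detectable effect for a two-arm comparison of means. The per-arm sample sizes below are hypothetical, not figures from this registration.

```python
from statistics import NormalDist

def mde(n_treat, n_control, sd=1.0, alpha=0.05, power=0.80):
    """Minimum detectable effect for a two-sided test comparing two
    means (in standard-deviation units when sd=1), normal approximation."""
    z = NormalDist().inv_cdf
    return (z(1 - alpha / 2) + z(power)) * sd * (1 / n_treat + 1 / n_control) ** 0.5

# e.g., 120 applicants per arm, pooled across hypothetical cohorts
effect = mde(120, 120)  # roughly 0.36 standard deviations
```

Pooling cohorts shrinks the detectable effect with the square root of the sample size, which is why several years of applicant cohorts may be needed before effects of plausible magnitude are detectable.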

Institutional Review Boards (IRBs)

IRB Name
National Bureau of Economic Research
IRB Approval Date
IRB Approval Number
IRB Ref#22-001


Post Trial Information

Study Withdrawal

There is information in this trial unavailable to the public.


Is the intervention completed?
Data Collection Complete
Data Publication

Data Publication

Is public data available?

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials