
When to Apply?
Last registered on September 25, 2020

Pre-Trial

Trial Information
General Information
Title
When to Apply?
RCT ID
AEARCTR-0006522
Initial registration date
September 24, 2020
Last updated
September 25, 2020 2:00 PM EDT
Location(s)
Region
Primary Investigator
Affiliation
Harvard Business School
Other Primary Investigator(s)
PI Affiliation
University of Toronto
Additional Trial Information
Status
Completed
Start date
2017-08-25
End date
2017-11-16
Secondary IDs
Abstract
Labor market outcomes depend, in part, upon an individual’s willingness to put herself forward for different career opportunities. In this paper, we use both laboratory and field experiments to better understand decisions around willingness to apply for higher return, more challenging work, with a focus on gender differences. We find that, in male-typed domains, women are less likely to view themselves as well-qualified for a given opening, both because of differences in forecasts about own ability, but also, and importantly, because of differences in how high men and women believe “the bar” is. We show that these beliefs matter for application decisions. Finally, using a simple field experiment, we provide evidence that a soft-touch intervention, in which required qualifications are stated more precisely and objectively, helps to close the gender gap in willingness to apply, increasing both the average talent and the diversity of the applicant pool.
External Link(s)
Registration Citation
Citation
Coffman, Katherine and Manuela Collis. 2020. "When to Apply?." AEA RCT Registry. September 25. https://doi.org/10.1257/rct.6522-1.0.
Sponsors & Partners

There are documents in this trial unavailable to the public. Use the button below to request access to this information.

Experimental Details
Interventions
Intervention(s)
Intervention Start Date
2017-08-25
Intervention End Date
2017-11-16
Primary Outcomes
Primary Outcomes (end points)
Application decision
Primary Outcomes (explanation)
Secondary Outcomes
Secondary Outcomes (end points)
Secondary Outcomes (explanation)
Experimental Design
Experimental Design
This is a field experiment on Upwork, an online labor market platform.

We posted three versions of a job ad on the Upwork platform. Each job ad asked participants with [management/analytical] expertise to apply for a short-term (1-hour) job answering brief essay questions about [management/analytical] skills. Each job ad lists two open positions (an intermediate-level and an expert-level job), and the candidate is invited to choose which position to apply to.


The three versions of the ad are nearly identical; the only difference is the stated job qualifications. The first ad (control version) contains a generic sentence about who should apply to the expert-level job rather than the intermediate-level job. The second ad ("positive" treatment version) contains the same generic sentence, plus a specific stated qualification: the expected test score of a successful applicant to the expert-level job. The third ad ("normative" treatment version) contains the generic sentence, plus a specific stated qualification encouraging applicants with at least a given test score to apply for the expert-level job.


We invited a sample of participants on Upwork to view the job ad and consider applying. Each participant was randomly assigned to be invited to one of the three job postings (control, positive, normative). We recorded their application decision. We followed up once with participants who did not apply before the specified deadline, or who applied to both jobs, asking them to specify which job they wanted to apply to. We hired the four most qualified applicants in each treatment to complete the job. We did not retain their completed work as part of our dataset.
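The registry states that randomization was done by computer in Excel, with the planned 1,200 individuals split evenly across the three conditions. As a hypothetical illustration only (not the authors' actual procedure), an even individual-level assignment of this kind can be sketched as:

```python
import random

def assign_arms(n_participants=1200,
                arms=("control", "positive", "normative"),
                seed=42):
    """Assign each participant to one arm, evenly split across arms.

    Illustrative sketch only: the study's actual randomization was
    performed in Excel; the seed and function name are assumptions.
    """
    per_arm, remainder = divmod(n_participants, len(arms))
    assert remainder == 0, "sample size must split evenly across arms"
    # Build a balanced list of arm labels, then shuffle it so the
    # ordering of participants carries no information about assignment.
    assignments = [arm for arm in arms for _ in range(per_arm)]
    random.Random(seed).shuffle(assignments)
    return assignments

assignments = assign_arms()
print({arm: assignments.count(arm) for arm in set(assignments)})
```

With a fixed seed the assignment is reproducible, and each arm receives exactly 400 of the 1,200 participants.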
Experimental Design Details
Randomization Method
By computer in Excel
Randomization Unit
Individual
Was the treatment clustered?
No
Experiment Characteristics
Sample size: planned number of clusters
1,200
Sample size: planned number of observations
1,200
Sample size (or number of clusters) by treatment arms
evenly split across three conditions (control, positive, normative)
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
IRB
INSTITUTIONAL REVIEW BOARDS (IRBs)
IRB Name
Study has received IRB approval. Details not available.
IRB Approval Date
Details not available
IRB Approval Number
Details not available
IRB Name
Harvard Business School IRB
IRB Approval Date
2017-10-04
IRB Approval Number
IRB17-1504
IRB Name
Harvard Business School IRB
IRB Approval Date
2017-08-08
IRB Approval Number
IRB17-1227
Post-Trial
Post Trial Information
Study Withdrawal
Intervention
Is the intervention completed?
No
Is data collection complete?
Data Publication
Data Publication
Is public data available?
No
Program Files
Program Files
Reports, Papers & Other Materials
Relevant Paper(s)
REPORTS & OTHER MATERIALS