When to Apply?

Last registered on September 25, 2020


Trial Information

General Information

When to Apply?
Initial registration date
September 24, 2020

The initial registration date is when the trial was registered, i.e., when the registration was submitted to the Registry to be reviewed for publication.

First published
September 25, 2020, 2:00 PM EDT

First published corresponds to when the trial was first made public on the Registry after being reviewed.



Primary Investigator

Harvard Business School

Other Primary Investigator(s)

PI Affiliation
University of Toronto

Additional Trial Information

Start date
End date
Secondary IDs
Labor market outcomes depend, in part, upon an individual’s willingness to put herself forward for different career opportunities. In this paper, we use both laboratory and field experiments to better understand decisions around willingness to apply for higher return, more challenging work, with a focus on gender differences. We find that, in male-typed domains, women are less likely to view themselves as well-qualified for a given opening, both because of differences in forecasts about own ability, but also, and importantly, because of differences in how high men and women believe “the bar” is. We show that these beliefs matter for application decisions. Finally, using a simple field experiment, we provide evidence that a soft-touch intervention, in which required qualifications are stated more precisely and objectively, helps to close the gender gap in willingness to apply, increasing both the average talent and the diversity of the applicant pool.
External Link(s)

Registration Citation

Coffman, Katherine and Manuela Collis. 2020. "When to Apply?." AEA RCT Registry. September 25. https://doi.org/10.1257/rct.6522
Sponsors & Partners

There are documents in this trial unavailable to the public. Use the button below to request access to this information.

Experimental Details


Intervention Start Date
Intervention End Date

Primary Outcomes

Primary Outcomes (end points)
Application decision
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
This is a field experiment on Upwork, an online labor-market platform.

We posted three versions of a job ad onto the Upwork platform. Each job ad asked participants with [management/analytical] expertise to apply for a short-term (1-hour) job answering brief essay questions about [management/analytical] skills. Each job ad contained two open positions (an intermediate-level and an expert-level job), and candidates were invited to choose which position to apply to.

The three versions of the ad are nearly identical; they differ only in the stated job qualifications. The first ad (control version) contains a generic sentence about who should apply to the expert-level job rather than the intermediate-level job. The second ad ("positive" treatment version) contains the same generic sentence, plus a specific stated qualification about the expected test score of a successful applicant to the expert-level job. The third ad ("normative" treatment version) contains the generic sentence, plus a specific stated qualification encouraging applicants with at least a given test score to apply for the expert-level job.

We invited a sample of participants on Upwork to view the job ad and consider applying. Each participant was randomly assigned to one of the three job postings (control, positive, normative). We recorded their application decision. We followed up once with participants who did not apply before the specified deadline, or who applied to both jobs, asking them to specify which job they wanted to apply to. We hired the four most qualified applicants in each treatment to complete the job. We did not retain their completed work as part of our dataset.
Experimental Design Details
Randomization Method
By computer in Excel
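The registry records only that randomization was done by computer in Excel, with participants evenly split across the three conditions. As an illustration (not the authors' actual procedure), a balanced random assignment of this kind can be sketched as follows; the function and participant names here are hypothetical:

```python
import random

def assign_conditions(participants,
                      conditions=("control", "positive", "normative"),
                      seed=0):
    """Randomly assign participants to conditions in (near-)equal shares.

    Balanced-permutation approach: replicate the condition list until it
    covers the sample, trim to the sample size, then shuffle.
    """
    rng = random.Random(seed)
    n = len(participants)
    # Repeat the conditions enough times to cover everyone, then trim to n.
    pool = (list(conditions) * (n // len(conditions) + 1))[:n]
    rng.shuffle(pool)
    return dict(zip(participants, pool))

# Example: nine hypothetical participants, three per condition.
assignments = assign_conditions([f"worker_{i}" for i in range(9)])
```

With a sample size divisible by three, each condition receives exactly one third of the participants, which matches the "evenly split" design stated above.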
Randomization Unit
Was the treatment clustered?

Experiment Characteristics

Sample size: planned number of clusters
Sample size: planned number of observations
Sample size (or number of clusters) by treatment arms
evenly split across three conditions (control, positive, normative)
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)

Institutional Review Boards (IRBs)

IRB Name
Study has received IRB approval. Details not available.
IRB Approval Date
Details not available
IRB Approval Number
Details not available
IRB Name
Harvard Business School IRB
IRB Approval Date
IRB Approval Number
IRB Name
Harvard Business School IRB
IRB Approval Date
IRB Approval Number


Post Trial Information

Study Withdrawal

There are documents in this trial unavailable to the public. Use the button below to request access to this information.



Is the intervention completed?
Data Collection Complete
Data Publication

Data Publication

Is public data available?

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials