Job Credentials In the Labor Market

Last registered on June 02, 2024


Trial Information

General Information

Initial registration date
May 20, 2018

Initial registration date is when the trial was registered.

It corresponds to when the registration was submitted to the Registry to be reviewed for publication.

First published
May 21, 2018, 4:35 PM EDT

First published corresponds to when the trial was first made public on the Registry after being reviewed.

Last updated
June 02, 2024, 1:53 AM EDT

Last updated is the most recent time when changes to the trial's registration were published.


Primary Investigator
Lisa Abraham


Other Primary Investigator(s)

Additional Trial Information

Start date
End date
Secondary IDs
Prior work
This trial does not extend or rely on any prior RCTs.
We run a randomized controlled trial with a company to test whether reframing language affects applications and the subsequent hiring process.
External Link(s)

Registration Citation

Abraham, Lisa. 2024. "Job Credentials In the Labor Market." AEA RCT Registry. June 02.
Sponsors & Partners


Experimental Details


We partner with a company to examine the role of language in job postings.
Intervention Start Date
Intervention End Date

Primary Outcomes

Primary Outcomes (end points)
Number and gender composition of applications, phone screens, interviews, and offers in both the Treated and Control groups. Data on applicant quality from applications.
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
The experiment randomizes individuals who visit the Company’s website into one of two groups: a Control group, where individuals see the original job posting, and a Treatment group, where individuals see a version of the job posting with reframed language. We then examine differences in the application process between the two groups.
Experimental Design Details
The experiment randomizes individuals who directly visit the Company’s career website. We identify individuals by their IP address and randomly sort them into one of two groups: (1) a Control group, where individuals see the original version of the job posting, or (2) a Treatment group, where individuals see a version of the job posting that deletes optional credentials, deletes adjectives describing skills, and reframes vague credentials.

The Treatment varies depending on the job posting. Some job postings contain many optional qualifications (e.g., “PhD preferred”), adjectives describing skills (e.g., “Excellent” before “coding skills”), and vague language/criteria (e.g., “think like your enemy”). We document the treatment for every job posting included in our experiment (i.e., optional qualifications deleted, adjectives deleted, vague language/criteria reframed).

Upon seeing the job posting on the Company’s career website, individuals decide whether or not to apply for the position. For those who decide to apply, the first step is to click on the “Apply Now” button, after which the individual is given the option of applying by signing in (if they already have an account on the Company’s career website), applying using their LinkedIn profile (where data from the individual’s LinkedIn profile is transmitted to the Company), submitting their resume, or applying “manually” by filling out responses to a series of questions.

We are able to track the treatment status of the applicants in our experiment throughout the hiring process (application, initial phone screen, technical phone screen, on-site interview, and offer). If an individual applies to more than one job posting at the Company over the duration of the experiment, we are able to examine the portfolio of their application choices.
Randomization Method
Randomization via computer
Randomization Unit
IP address of individual visiting Company website
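The registry does not describe the assignment mechanism beyond "randomization via computer" at the IP-address level. One common way to implement such an assignment (a minimal sketch, not the study's actual code) is to hash each salted IP address so that every visitor is deterministically and stably mapped to one of the two arms:

```python
import hashlib

def assign_group(ip_address: str, salt: str = "job-ad-experiment") -> str:
    """Deterministically assign a visitor's IP address to Treatment or Control.

    Hashing the salted IP gives a stable ~50/50 split: the same visitor
    sees the same version of the job posting on every visit. The salt
    name here is purely illustrative.
    """
    digest = hashlib.sha256((salt + ip_address).encode()).hexdigest()
    return "treatment" if int(digest, 16) % 2 == 0 else "control"
```

Because assignment is a pure function of the IP address, no per-visitor state needs to be stored to keep the experience consistent across visits.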
Was the treatment clustered?

Experiment Characteristics

Sample size: planned number of clusters
Approximately 150,000 IP address-identified individuals
Sample size: planned number of observations
Approximately 150,000 IP address-identified individuals
Sample size (or number of clusters) by treatment arms
Treatment group: approximately 75,000 IP address-identified individuals
Control group: approximately 75,000 IP address-identified individuals
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
Supporting Documents and Materials


Document Name
Treatment Tracker
Document Type
Document Description
This file displays the Treated and Control qualifications section for each of the job postings included in the experiment.

MD5: 3b4eda89e3fc8ed64a0efdba2484d186

SHA1: 7b3855814155b8480a550cbf06152ee6b49f52aa

Uploaded At: August 14, 2020


Institutional Review Boards (IRBs)

IRB Name
Harvard University-Area Committee on the Use of Human Subjects
IRB Approval Date
IRB Approval Number
Analysis Plan

Analysis Plan Documents


MD5: 170c38b6bd87cfb3a14b4044e29e42f2

SHA1: 632550b6b13eb20f8435a00665e2e2e8607ee381

Uploaded At: October 03, 2018


Post Trial Information

Study Withdrawal

There is information in this trial unavailable to the public.


Is the intervention completed?
Intervention Completion Date
December 14, 2018, 12:00 +00:00
Data Collection Complete
Data Collection Completion Date
December 14, 2018, 12:00 +00:00
Final Sample Size: Number of Clusters (Unit of Randomization)
Was attrition correlated with treatment status?
Final Sample Size: Total Number of Observations
Final Sample Size (or Number of Clusters) by Treatment Arms
~30,000 in each treatment arm
Data Publication

Data Publication

Is public data available?

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

If women are more sensitive to listed qualifications in job ads, does lowering the bar draw in relatively more women and increase diversity in the applicant pool? We examine this question by randomizing 60,000 viewers into one of two job ad versions for over 600 corporate jobs at Uber, where the treatment removed optional and superfluous qualifications. There are two main findings. First, job seekers of both genders respond to qualifications: applications increase by 7%, owing to similar increases in the number of applications from men and women. Second, reducing the qualifications impacts the type of individual who chooses to apply differently by gender. Reducing the qualifications draws in less skilled women and causes an outflow of some highly skilled women. Conversely, the treatment draws in men from across the skill distribution, including the upper end. We find gender differences in application behavior and explore potential mechanisms in a separate, large-scale survey using the RAND American Life Panel. These results highlight that sensitivity to listed requirements is complex, and simply lowering the qualifications in job postings is not guaranteed to increase applicant diversity.

Reports & Other Materials

Final Treatment Tracker
Abraham, Lisa. 2024. "Final Treatment Tracker." AEA RCT Registry. June 02.



Uploaded At: June 02, 2024