Job Credentials In the Labor Market
Last registered on August 14, 2020

Pre-Trial

Trial Information
General Information
Title
Job Credentials In the Labor Market
RCT ID
AEARCTR-0002963
Initial registration date
May 20, 2018
Last updated
August 14, 2020 3:17 PM EDT
Location(s)
Primary Investigator
Affiliation
Other Primary Investigator(s)
Additional Trial Information
Status
Completed
Start date
2018-08-13
End date
2020-07-20
Secondary IDs
Abstract
We run a randomized control trial with a company to test whether reframing language affects applications and the subsequent hiring process.
External Link(s)
Registration Citation
Citation
Abraham, Lisa. 2020. "Job Credentials In the Labor Market." AEA RCT Registry. August 14. https://doi.org/10.1257/rct.2963-20.1.
Former Citation
Abraham, Lisa. 2020. "Job Credentials In the Labor Market." AEA RCT Registry. August 14. http://www.socialscienceregistry.org/trials/2963/history/73981.
Sponsors & Partners
Partner(s)
Type
Private company
Experimental Details
Interventions
Intervention(s)
We partner with a company to examine the role of language in job postings.
Intervention Start Date
2018-08-13
Intervention End Date
2018-12-14
Primary Outcomes
Primary Outcomes (end points)
Number and gender composition of applications, phone screens, interviews, and offers in both the Treated and Control groups. Data on applicant quality from applications.
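As a rough illustration of these endpoints, stage-level counts and gender composition by arm could be tabulated along the following lines. This is a minimal sketch assuming applicant-level records with hypothetical column names ("arm", "stage", "gender"); it is not the study's actual data pipeline.

```python
import pandas as pd

# Hypothetical applicant-level records: one row per applicant,
# recording treatment arm, furthest hiring stage reached, and gender.
df = pd.DataFrame({
    "arm":    ["treatment", "control", "treatment", "control"],
    "stage":  ["application", "application", "phone_screen", "offer"],
    "gender": ["F", "M", "F", "M"],
})

# Number of applicants and share of women at each stage, by arm.
summary = (
    df.groupby(["arm", "stage"])
      .agg(n=("gender", "size"),
           share_female=("gender", lambda g: (g == "F").mean()))
      .reset_index()
)
print(summary)
```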
Primary Outcomes (explanation)
Secondary Outcomes
Secondary Outcomes (end points)
Secondary Outcomes (explanation)
Experimental Design
Experimental Design
The experiment randomizes individuals who visit the Company’s website into one of two groups: a Control group where individuals see the original job posting, and a Treatment group where individuals see a version of the job posting with reframed language. We then examine differences in the application process between the two groups.
Experimental Design Details
The experiment randomizes individuals who directly visit the Company’s career website. We identify individuals by their IP address and randomly sort them into one of two groups: (1) a Control group, where individuals see the original version of the job posting, or (2) a Treatment group, where individuals see a version of the job posting that deletes optional credentials, deletes adjectives describing skills, and reframes vague credentials. The Treatment varies depending on the job posting: some postings contain many optional qualifications (e.g., “PhD preferred”), adjectives describing skills (e.g., “Excellent” before “coding skills”), and vague language or criteria (e.g., “think like your enemy”). We document the treatment applied to every job posting included in our experiment (i.e., optional qualifications deleted, adjectives deleted, vague language/criteria reframed).

Upon seeing the job posting on the Company’s career website, individuals decide whether or not to apply for the position. For those who decide to apply, the first step is to click the “Apply Now” button, after which the individual may apply by signing in (if they already have an account on the Company’s career website), by using their LinkedIn profile (in which case data from the profile is transmitted to the Company), by submitting their resume, or “manually” by filling out responses to a series of questions.

We track the treatment status of the applicants in our experiment throughout the hiring process (application, initial phone screen, technical phone screen, on-site interview, and offer). If an individual applies to more than one job posting at the Company over the duration of the experiment, we can examine the portfolio of their application choices.
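The assignment mechanism described above (computer randomization keyed to the visiting IP address) could be implemented roughly as below. The function name, the salted-hash scheme, and the 50/50 split are illustrative assumptions for this sketch, not the Company's actual system.

```python
import hashlib

def assign_arm(ip_address: str, salt: str = "rct-2963") -> str:
    """Deterministically assign a site visitor to an experimental arm by IP.

    Hashing the salted IP address yields a stable pseudo-random draw, so
    repeat visits from the same address see the same posting version.
    """
    digest = hashlib.sha256(f"{salt}:{ip_address}".encode()).hexdigest()
    # Treat the hash as a uniform draw on [0, 1) and split 50/50 across arms.
    draw = int(digest, 16) / 16 ** len(digest)
    return "treatment" if draw < 0.5 else "control"

# Example: a visitor from this (documentation-range) address gets a fixed arm.
print(assign_arm("203.0.113.42"))
```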
Randomization Method
Randomization via computer
Randomization Unit
IP address of individual visiting Company website
Was the treatment clustered?
No
Experiment Characteristics
Sample size: planned number of clusters
Approximately 150,000 IP address-identified individuals
Sample size: planned number of observations
Approximately 150,000 IP address-identified individuals
Sample size (or number of clusters) by treatment arms
Treatment group: approximately 75,000 IP address-identified individuals
Control group: approximately 75,000 IP address-identified individuals
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
Supporting Documents and Materials
Documents
Document Name
Treatment Tracker
Document Type
other
Document Description
This file displays the Treated and Control qualifications section for each of the job postings included in the experiment.
File
Treatment Tracker

MD5: 3b4eda89e3fc8ed64a0efdba2484d186

SHA1: 7b3855814155b8480a550cbf06152ee6b49f52aa

Uploaded At: August 14, 2020

IRB
Institutional Review Boards (IRBs)
IRB Name
Harvard University-Area Committee on the Use of Human Subjects
IRB Approval Date
2017-12-19
IRB Approval Number
IRB17-1703
Analysis Plan
Analysis Plan Documents
Pre-Analysis Plan_Abraham.pdf

MD5: 170c38b6bd87cfb3a14b4044e29e42f2

SHA1: 632550b6b13eb20f8435a00665e2e2e8607ee381

Uploaded At: October 03, 2018

Post-Trial
Post Trial Information
Study Withdrawal
Intervention
Is the intervention completed?
Yes
Intervention Completion Date
December 14, 2018, 12:00 AM +00:00
Is data collection complete?
Yes
Data Collection Completion Date
Final Sample Size: Number of Clusters (Unit of Randomization)
Was attrition correlated with treatment status?
Final Sample Size: Total Number of Observations
Final Sample Size (or Number of Clusters) by Treatment Arms
Data Publication
Data Publication
Is public data available?
No
Program Files
Program Files
Reports, Papers & Other Materials
Relevant Paper(s)
Reports & Other Materials