
Job Credentials In the Labor Market

Last registered on September 01, 2018

Pre-Trial

Trial Information

General Information

Title
Job Credentials In the Labor Market
RCT ID
AEARCTR-0002963
Initial registration date
May 20, 2018

Initial registration date is when the trial was registered; it corresponds to when the registration was submitted to the Registry to be reviewed for publication.

First published
May 21, 2018, 4:35 PM EDT

First published corresponds to when the trial was first made public on the Registry after being reviewed.

Last updated
September 01, 2018, 4:32 PM EDT

Last updated is the most recent time when changes to the trial's registration were published.

Locations

Primary Investigator

Affiliation

Other Primary Investigator(s)

Additional Trial Information

Status
Ongoing
Start date
2018-08-13
End date
2021-12-20
Secondary IDs
Abstract
We run a randomized controlled trial with a company to test whether reframing language affects applications and the subsequent hiring process.
External Link(s)

Registration Citation

Citation
Abraham, Lisa. 2018. "Job Credentials In the Labor Market." AEA RCT Registry. September 01. https://doi.org/10.1257/rct.2963-11.0
Former Citation
Abraham, Lisa. 2018. "Job Credentials In the Labor Market." AEA RCT Registry. September 01. https://www.socialscienceregistry.org/trials/2963/history/33765
Sponsors & Partners

Partner

Type
private_company

There is information in this trial unavailable to the public.
Experimental Details

Interventions

Intervention(s)
We partner with a company to examine the role of language in job postings.
Intervention Start Date
2018-08-13
Intervention End Date
2019-08-01

Primary Outcomes

Primary Outcomes (end points)
Number and gender composition of applications, phone screens, interviews, and offers in both the Treated and Control groups. Data on applicant quality from applications.
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
The experiment randomizes individuals who visit the Company's website into one of two groups: a Control group where individuals see the original job posting, and a Treatment group where individuals see a version of the job posting with reframed language. We then examine differences in the application process between the two groups.
Experimental Design Details
The experiment randomizes individuals who directly visit the Company's career website. We identify individuals by their IP address and randomly sort them into one of two groups: (1) a Control group, where individuals see the original version of the job posting, or (2) a Treatment group, where individuals see a version of the job posting that deletes optional credentials, deletes adjectives describing skills, and reframes vague credentials.

The Treatment varies depending on the job posting. Some job postings contain many optional qualifications (e.g., "PhD preferred"), adjectives describing skills (e.g., "Excellent" before "coding skills"), and vague language/criteria (e.g., "think like your enemy"). We document the treatment for every job posting included in our experiment (i.e., optional qualifications deleted, adjectives deleted, vague language/criteria reframed).

Upon seeing the job posting on the Company’s career website, individuals decide whether or not to apply for the position. For those who decide to apply, the first step is to click on the “Apply Now” button, after which the individual is given the option of applying by signing in (if they already have an account on the Company’s career website), applying using their LinkedIn profile (where data from the individual’s LinkedIn profile is transmitted to the Company), submitting their resume, or applying “manually” by filling out responses to a series of questions.

We are able to track the treatment status of the applicants in our experiment throughout the hiring process (application, initial phone screen, technical phone screen, on-site interview, and offer). If an individual applies to more than one job posting at the Company over the duration of the experiment, we are able to examine the portfolio of their application choices.
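The registry does not disclose the Company's randomization code; as an illustration only, IP-based assignment of the kind described above is often implemented by hashing the identifier, which makes the assignment deterministic so that a returning visitor always sees the same version of the posting. A minimal sketch (the function name, salt, and arm labels are assumptions, not the study's actual implementation):

```python
import hashlib

def assign_arm(ip_address: str, salt: str = "job-posting-experiment") -> str:
    """Deterministically assign an IP address to 'treatment' or 'control'.

    Hashing a salted identifier (rather than drawing a fresh coin flip per
    visit) keeps a visitor's assignment stable across repeat visits and
    yields an approximately 50/50 split across distinct IP addresses.
    """
    digest = hashlib.sha256((salt + ip_address).encode()).hexdigest()
    return "treatment" if int(digest, 16) % 2 == 0 else "control"

# A visitor's assignment is stable across visits:
assert assign_arm("203.0.113.7") == assign_arm("203.0.113.7")
```

In practice a design like this also needs to handle shared IP addresses (offices, NATs), which is one reason the registry's unit of randomization is described as "IP address-identified individuals" rather than individuals per se.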
Randomization Method
Randomization via computer
Randomization Unit
IP address of individual visiting Company website
Was the treatment clustered?
No

Experiment Characteristics

Sample size: planned number of clusters
Approximately 150,000 IP address-identified individuals
Sample size: planned number of observations
Approximately 150,000 IP address-identified individuals
Sample size (or number of clusters) by treatment arms
Treatment group: approximately 75,000 IP address-identified individuals
Control group: approximately 75,000 IP address-identified individuals
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
Supporting Documents and Materials

There is information in this trial unavailable to the public.
IRB

Institutional Review Boards (IRBs)

IRB Name
Harvard University-Area Committee on the Use of Human Subjects
IRB Approval Date
2017-12-19
IRB Approval Number
IRB17-1703
Analysis Plan

Analysis Plan Documents

Pre-Analysis+Plan_Abraham.pdf

MD5: 353d64de91db3a9cfaf463d502ef1c6f

SHA1: 544159280ca3e5720494c4518b8ef349a0bf7061

Uploaded At: August 14, 2018

Post-Trial

Post Trial Information

Study Withdrawal

There is information in this trial unavailable to the public.

Intervention

Is the intervention completed?
No
Data Collection Complete
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials