Labor Market Returns to Upskilling - A Combination of Audit Study and Resume Review

Last registered on April 21, 2020

Pre-Trial

Trial Information

General Information

Title
Labor Market Returns to Upskilling - A Combination of Audit Study and Resume Review
RCT ID
AEARCTR-0005718
Initial registration date
April 20, 2020

First published
April 21, 2020, 11:27 AM EDT


Locations

Region

Primary Investigator

Affiliation
Yale University

Other Primary Investigator(s)

PI Affiliation
Yale University

Additional Trial Information

Status
In development
Start date
2021-06-01
End date
2021-09-01
Secondary IDs
Abstract
The US labor market has experienced significant shifts over the last few decades due to skill-biased technological changes (Autor et al., 2010). This polarization of employment opportunities has exacerbated income inequality across the country, marginalizing a large fraction of the workforce who do not have the background and resources to pursue high-skilled careers through higher education. One potential solution is to retrain displaced workers and disadvantaged populations for technology industries with considerable growth potential. However, there is a lack of evidence on what types of retraining programs are effective in helping disadvantaged groups gain a foothold in high-skilled industries. This research project aims to estimate the effectiveness of entry-level online tech certificates in helping individuals move from other sectors into the tech industry. In addition, this project asks whether online tech certificates are more, less, or equally valuable for workers without relevant tech experience and education. Finally, this project asks whether age interacts with the returns to tech certificates.

Credibly estimating the labor market returns to online certificates poses significant challenges. Many retraining programs (e.g., Year Up) have strict admission guidelines for participants and often work closely with corporate partners to place participants at the end of the program. Therefore, simply comparing the incomes of individuals with and without certificates would lead to biased estimates, because workers who choose to obtain certificates are inherently different from those who do not. To address these challenges, this project uses a combination of audit study and Incentivized Resume Rating (IRR) methods. The core of the research design is an audit study that elicits employers’ true preferences for IT certificates. In this step, we first carefully design artificial job applications and randomly assign a subset of the application profiles to have selected online tech certificates. Then, the resumes are sent out to job openings posted on a large online job search platform in the US, and call-back responses are analyzed. A subset of these artificial applications is further subjected to incentivized resume rating, in which third-party resume evaluators decide whether they would call back these applicants and what salary they would offer them. By analyzing the call-back results from the IRR and audit studies, we first establish a correspondence between the weights that resume evaluators and real employers assign to various observable worker characteristics. This correspondence is then applied to resume reviewers' behavior in the IRR salary study to recover the expected labor market returns to skill certificates in a hypothetical audit study on salary.

Results from this project will inform policy-makers about the viability of using certifications and related training programs to help displaced workers move into the tech industry. Given the increasing pace of technological advancement and the changing nature of jobs, these results have the potential to guide policies that reduce income inequality while building a pool of high-skilled workers. This project will generate rich datasets on worker profiles typical of young, entry-level job applicants in the tech industry, call-back rates from the audit and IRR studies, and expected offers from the IRR salary study. Summary statistics from these datasets will be of independent interest to organizations that design and administer online skill certification programs. Methodologically, this project provides the first direct examination of whether third-party resume reviews are a viable alternative to audit studies, and proposes a new method that exploits the relative advantages of both audit studies and resume review studies.
External Link(s)

Registration Citation

Citation
Sinha, Sourav and Zhengren Zhu. 2020. "Labor Market Returns to Upskilling - A Combination of Audit Study and Resume Review." AEA RCT Registry. April 21. https://doi.org/10.1257/rct.5718-1.0
Experimental Details

Interventions

Intervention(s)
Intervention Start Date
2021-06-01
Intervention End Date
2021-09-01

Primary Outcomes

Primary Outcomes (end points)
Call-back response for the audit study.
Reviewer decisions on call-back, job offer, and expected wage.
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
This project uses a combination of audit study and Incentivized Resume Rating (IRR) methods. The core of the research design is an audit study that elicits employers’ true preferences for IT certificates. In this step, we first carefully design artificial job applications and randomly assign a subset of the application profiles to have selected online tech certificates. In addition, we randomly assign each applicant to either have or not have previous tech-related experience, and to be either young (roughly 22 years old) or older (roughly 35 years old). Then, the resumes are sent out to job openings posted on a large online job search platform in the US, and call-back responses are analyzed. A subset of these artificial applications is further subjected to incentivized resume rating, in which third-party resume evaluators decide whether they would call back these applicants and what salary they would offer them. By analyzing the call-back results from the IRR and audit studies, we first establish a correspondence between the weights that resume evaluators and real employers assign to various observable worker characteristics. This correspondence is then applied to resume reviewers' behavior in the IRR salary study to recover the expected labor market returns to skill certificates in a hypothetical audit study on salary.
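
To fix ideas, the sketch below illustrates this two-step logic on simulated placeholder data. It is not the registered estimation code: the variable names, the linear-probability specifications, and the simple ratio-based correspondence between the certificate weights are all illustrative assumptions.

# Illustrative sketch only; placeholder data, names, and numbers are assumptions.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

def simulate_resumes(n, with_salary=False):
    """Placeholder data with the randomized resume characteristics."""
    df = pd.DataFrame({
        "certificate": rng.integers(0, 2, n),
        "tech_experience": rng.integers(0, 2, n),
        "older": rng.integers(0, 2, n),
    })
    # Hypothetical outcomes: call-back decisions and (for the IRR) expected salary.
    df["callback"] = rng.binomial(1, 0.13 + 0.03 * df["certificate"])
    if with_salary:
        df["salary"] = 55000 + 2500 * df["certificate"] + rng.normal(0, 5000, n)
    return df

audit = simulate_resumes(12000)                  # employers' call-back decisions
irr = simulate_resumes(1000, with_salary=True)   # reviewers' decisions and salaries

chars = "certificate + tech_experience + older"

# Step 1: weights on resume characteristics in each call-back study.
audit_cb = smf.ols(f"callback ~ {chars}", data=audit).fit()
irr_cb = smf.ols(f"callback ~ {chars}", data=irr).fit()

# Step 2: one simple correspondence -- the ratio of the call-back weights
# that employers and reviewers place on the certificate.
scale = audit_cb.params["certificate"] / irr_cb.params["certificate"]

# Step 3: rescale the IRR salary weight to obtain the implied return to the
# certificate in a hypothetical audit study on salary.
irr_salary = smf.ols(f"salary ~ {chars}", data=irr).fit()
implied_return = scale * irr_salary.params["certificate"]
print(f"Implied audit-study salary return to certificate: {implied_return:,.0f}")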
Experimental Design Details
Drawing on the skill requirements listed in relevant job advertisements, we construct a large number of application profiles. The applications are divided into six branches. Applicants are first randomly divided into two groups: those with previous tech-industry experience and those without. Then, within each group, we randomly assign applicants to either have or not have a tech certificate. In addition, we randomly assign applicants to be either young (roughly 22 years old) or older (roughly 35 years old). Finally, we drop applicants who are older and also have previous tech experience, since such individuals would be more experienced tech workers who do not compete in the labor market of interest for this study.
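
As an illustration of this assignment scheme (the study itself uses the Lahey and Beasley (2007) resume generator), a minimal sketch of the factorial assignment with the dropped cells is shown below; the variable names are assumptions, and the arm size of 2,000 comes from the planned sample sizes listed later in this registration.

# Illustrative sketch of the factorial assignment; not the actual generator code.
import itertools
import random

random.seed(42)

N_PER_ARM = 2000  # planned applications per arm (from this registration)

# Full 2 x 2 x 2 design over tech experience, certificate, and age ...
cells = list(itertools.product([0, 1], [0, 1], [0, 1]))

# ... minus the two cells with older, tech-experienced applicants, who fall
# outside the entry-level labor market of interest.
arms = [(exp, cert, old) for (exp, cert, old) in cells
        if not (exp == 1 and old == 1)]
assert len(arms) == 6

# Build the planned pool of applications and shuffle before sending.
applications = [
    {"tech_experience": exp, "certificate": cert, "older": old}
    for (exp, cert, old) in arms
    for _ in range(N_PER_ARM)
]
random.shuffle(applications)
print(len(applications))  # 12000 planned applications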

The hypothetical job applicants' resumes have three key components: educational background, previous employment, and skills and qualifications. We design the resumes such that all hypothetical applicants have obtained a bachelor's degree from a non-selective public four-year institution. Their degrees will be in fields that are not directly related to the tech industry, but that are related to the industries most affected by skill-biased technological changes (e.g., mechanical engineering). Furthermore, the year of graduation will be listed as between 2012 and 2015, so that the applicants will be perceived to have five to eight years of work experience and to be between 25 and 30 years of age. This allows the applications to reflect the backgrounds of individuals seeking a career transition, and also prevents the applicants from being subject to potential age discrimination in the tech industry.

When designing the applicants' previous employment experience, we rely on job descriptions from real-world hiring advertisements posted on the job-search platform. We scrape the job descriptions of relevant job advertisements and rephrase and repackage them into descriptions of the applicants' previous employment. Job descriptions for applicants with relevant tech-industry experience will be sourced from job advertisements in our targeted tech-industry verticals. Descriptions of experience for applicants without relevant tech-industry experience will come from job advertisements in the industries most affected by skill-biased technological changes (e.g., manufacturing).

In the skills and qualifications section of the resumes, we will include the certification assigned by randomization as well as certain computer/tech skills that are necessary for applying to tech-industry jobs. For example, basic coding knowledge in Java and working knowledge of Microsoft Excel will be included in all resumes.

We search for appropriate job openings through the online search platform we partner with. In conducting the search, we will use keywords such as “technology”, “information technology”, or “IT manager”, and restrict the search to jobs requiring zero to three years of tech-related experience.

We will create a large number of email accounts to be associated with the job applications. Our study team has established a partnership with a US enterprise telecommunications service company that will provide us with a large number of phone numbers with voicemail services. When sending out the job applications, members of the study team will be instructed not to provide any information that is not designed into the resumes. This ensures that the randomization design stays intact and avoids the risk of raising concerns among the hiring firms.

The study team will record call-backs by regularly monitoring the email accounts and the voicemails received in the phone accounts. To avoid disclosing additional information and exposing the artificial nature of the study, members of the study team will be instructed not to have direct phone conversations with employers. In a previous audit study, Zhu (2020) documents that more than 95% of employers who call audit-study job applicants leave voicemails with sufficient information to record call-backs. If an employer calls a job applicant more than three times, the study team will reach out to the employer's hiring office to indicate that the applicant is no longer looking for a job. No interview opportunities or job offers will be accepted by the study team.

In the resume review part of the study, we send a random subset of the artificial job applications to professional resume reviewers. The subset will be representative of the universe of applicant-job advertisement pairs from our audit study, while preserving the four basic groups of job applicants. We also provide a de-identified summary of the job opening associated with each application. We then ask the reviewers two questions. First, based on the application and the job description, would you be willing to offer this candidate an interview? Second, based on the application and the type of job the individual is applying for, what salary would you expect to offer? The reviewers may be aware that they are part of an experiment, because in most real-life resume-rating situations reviewers would also have access to the name of the employer and many other identifying details, which we must omit from the IRR study because of confidentiality concerns. Reviewers will be hired at market rates.
Randomization Method
Randomization is implemented with a random resume generator (Lahey and Beasley, 2007).
Randomization Unit
Individual
Was the treatment clustered?
No

Experiment Characteristics

Sample size: planned number of clusters
12000 individual applications
Sample size: planned number of observations
12000 individual applications
Sample size (or number of clusters) by treatment arms
Six types of resumes: (1) no tech experience, no certificate, young; (2) no tech experience, with certificate, young; (3) with tech experience, no certificate, young; (4) with tech experience, with certificate, young; (5) no tech experience, no certificate, older; (6) no tech experience, with certificate, older. 2000 individual applications in each arm, for a total of 12000 individual applications.
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
Power: 0.81; 6 treatment arms lead to 15 pairwise comparisons; group A proportion: 0.16; group B proportion: 0.13.
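
For reference, a single pairwise power calculation of this kind can be approximated with standard two-sample proportion formulas, as in the sketch below; the per-arm sample size of 2,000, the two-sided 5% significance level, and the absence of a multiplicity adjustment are assumptions and may not match the exact choices behind the registered power of 0.81.

# Rough sketch of a pairwise power calculation for call-back proportions of
# 0.16 vs. 0.13. Sample size, alpha, and the lack of a multiplicity
# adjustment are assumptions for illustration.
from statsmodels.stats.proportion import proportion_effectsize
from statsmodels.stats.power import NormalIndPower

p_a, p_b = 0.16, 0.13   # call-back proportions in the two arms being compared
n_per_arm = 2000        # planned applications per arm
alpha = 0.05            # unadjusted; dividing by 15 would Bonferroni-correct
                        # for the 15 pairwise comparisons

effect = proportion_effectsize(p_a, p_b)   # Cohen's h
power = NormalIndPower().solve_power(
    effect_size=effect, nobs1=n_per_arm, alpha=alpha,
    ratio=1.0, alternative="two-sided",
)
print(f"Power for a single pairwise comparison: {power:.2f}")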
IRB

Institutional Review Boards (IRBs)

IRB Name
Yale University IRB
IRB Approval Date
2020-03-20
IRB Approval Number
2000027608

Post-Trial

Post Trial Information

Study Withdrawal

Intervention

Is the intervention completed?
No
Data Collection Complete
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials