Impact of AI skills on job application outcomes: Evidence from a field experiment
Last registered on March 21, 2021

Pre-Trial

Trial Information
General Information
Title
Impact of AI skills on job application outcomes: Evidence from a field experiment
RCT ID
AEARCTR-0007097
Initial registration date
March 01, 2021
Last updated
March 21, 2021 9:14 AM EDT
Location(s)

This section is unavailable to the public.
Primary Investigator
Affiliation
Humboldt-Universität zu Berlin
Other Primary Investigator(s)
PI Affiliation
Humboldt-Universität zu Berlin
PI Affiliation
Humboldt-Universität zu Berlin
Additional Trial Information
Status
In development
Start date
2021-03-02
End date
2021-04-30
Secondary IDs
Abstract
We investigate the demand for AI skills in real job applications with a correspondence experiment. We send fictitious job applications in response to real job postings for entry-level employees with a business administration background on UK-based online job boards, randomly varying whether the CVs include AI skills. First, we study the effect of these skills on the callback rate. Second, we investigate potential differences across the three types of business administration jobs (marketing, HR, and finance and accounting).
External Link(s)
Registration Citation
Citation
Danilov, Anastasia, Teo Firpo and Lukas Niemann. 2021. "Impact of AI skills on job application outcomes: Evidence from a field experiment." AEA RCT Registry. March 21. https://doi.org/10.1257/rct.7097-1.1.
Experimental Details
Interventions
Intervention(s)
We adopt a 3 x 2 design. For each of the three types of business administration jobs (marketing, HR, and finance and accounting), we create a CV with a cover letter. The documents across the three job types are comparable in skills and experience. We then randomly either add AI skills to an applicant's documents or leave them unchanged. Apart from the AI skills, the applications within each job type are identical. We select suitable job postings from UK-based online job boards in each of the three job categories and send the CV and cover letter for the specific job type, randomizing whether they include AI skills.
Intervention Start Date
2021-03-02
Intervention End Date
2021-04-30
Primary Outcomes
Primary Outcomes (end points)
Our primary outcome is callback from the employer. In general, callback is a response from the employer indicating interest in the applicant. We differentiate between two different versions of callback as in Baert et al. (2013).

1. Callback sensu stricto is every callback from an employer that implies a direct job offer or an invitation to an interview. Other positive responses, rejections, and non-responses are not counted.
2. Callback sensu lato includes every callback from an employer that implies any positive response. Rejections and non-responses are not counted.

Callbacks are received by email or voice mail.

References

Baert, S., Cockx, B., Gheyle, N. & Vandamme, C. (2013). Do Employers Discriminate Less if Vacancies are Difficult to Fill? Evidence from a Field Experiment. CESifo Working Paper Series 4093, CESifo.
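As a sketch, the two callback measures above can be coded from the observed response type. Which intermediate categories count as a positive response is specified in the pre-analysis plan, so the category names and the composition of the sensu lato set below are illustrative assumptions only:

```python
# Illustrative coding of the two callback measures (Baert et al., 2013).
# Category names are hypothetical; the authoritative mapping is in the
# pre-analysis plan.
STRICT_POSITIVE = {"direct job offer", "invitation to an interview"}
OTHER_POSITIVE = {
    "invitation to an interview for an alternative position",
    "phone voice mail with request to call back",
    "invitation to an assessment center",
    "request for additional information",
}

def callback_sensu_stricto(response: str) -> bool:
    # Direct job offer or interview invitation only
    return response in STRICT_POSITIVE

def callback_sensu_lato(response: str) -> bool:
    # Any positive response; rejections and non-responses excluded
    return response in STRICT_POSITIVE | OTHER_POSITIVE

print(callback_sensu_stricto("rejection"))                # False
print(callback_sensu_lato("invitation to an interview"))  # True
```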
Primary Outcomes (explanation)
For a detailed description of how the variables are constructed, please refer to the pre-analysis plan.
Secondary Outcomes
Secondary Outcomes (end points)
Additionally, we measure the response type. This is an ordinal variable and possible values in descending order are: direct job offer, invitation to an interview, invitation to an interview for an alternative position, phone voice mail with request to call back, invitation to an assessment center, phone call without voice mail message, request for additional information, rejection, no response.

We also measure the response time. This is defined as the time between sending the application and receiving an answer by email or voice mail from the employer.
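A minimal sketch of how these two secondary outcomes could be encoded, using the ordinal ranking exactly as listed above (rank 0 is the best outcome) and counting response time in days; the function names are illustrative:

```python
from datetime import datetime

# Ordinal coding of response type, in the descending order given in the registry
RESPONSE_ORDER = [
    "direct job offer",
    "invitation to an interview",
    "invitation to an interview for an alternative position",
    "phone voice mail with request to call back",
    "invitation to an assessment center",
    "phone call without voice mail message",
    "request for additional information",
    "rejection",
    "no response",
]
RESPONSE_RANK = {r: i for i, r in enumerate(RESPONSE_ORDER)}  # 0 = best outcome

def response_time_days(sent: datetime, answered: datetime) -> int:
    # Time between sending the application and the employer's answer
    return (answered - sent).days

print(RESPONSE_RANK["rejection"])                                        # 7
print(response_time_days(datetime(2021, 3, 2), datetime(2021, 3, 9)))    # 7
```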
Secondary Outcomes (explanation)
For a detailed description of how the variables are constructed, please refer to the pre-analysis plan.
Experimental Design
Experimental Design
We use a 3 x 2 experimental design to investigate the effect of AI skills on job search success. First, for each job type (marketing, HR, and finance and accounting) we create a control CV and cover letter, without AI skills, for a fictitious applicant. In a second variant, we add AI skills to the applicant profile. In summary, we have two CVs for each of the three job types. For each job type, we search for matching job postings for entry-level employees with a business administration background on UK-based online job boards and randomly send one of the two applicant profiles (with vs. without AI skills) to the employer. Conditional on being able to obtain the UK company number of the respective company, we also collect data on the company's SIC code and age (using the UK's Companies House database). We track callbacks from employers for each observation. We first compare the callback rate (both sensu stricto and sensu lato) for all applications including AI skills (across all three job types) to all applications without AI skills. We then compare the callback rate by job type and presence of AI skills.
Experimental Design Details
Not available
Randomization Method
The randomization is stratified by job type. Because we cannot control how many job postings for each job type will be available on any given week, and we would like to ensure numerical balance between the treatment and control groups over the timeline, we randomize job postings in blocks of four (e.g., Treatment-Control-Control-Treatment). Between March and April 2021, we will search for suitable job postings in each job type on a weekly basis and apply. We also randomize the order of job types in which we search for, and apply to, job postings (such that we do not always start with e.g., ‘marketing’ and end with ‘HR’, but rather rotate through them on a daily basis). To maintain the integrity of the randomization, we will respect the block nature of the randomization sequence (i.e., ensuring there are at least four applications available before starting with a new block). We will only send applications to job postings that do not immediately require a test or supplementary information (beyond a CV and cover letter), to avoid immediate rejections due to insufficient information. We will not send more than one application to a company, to avoid detection.

Randomization was carried out in Stata (seed 12345); randomization code is available upon request.
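The registry reports that randomization was done in Stata with seed 12345. As a sketch only, the block-of-four scheme described above (two treatment and two control assignments per block, shuffled within each block and stratified by job type) could be reproduced along these lines; the per-job-type seed offsets are an illustrative assumption:

```python
import random

def block_sequence(n_blocks: int, seed: int = 12345) -> list[str]:
    # Each block of four contains two treatment ("T") and two control ("C")
    # assignments in random order, ensuring balance over the timeline.
    rng = random.Random(seed)
    seq = []
    for _ in range(n_blocks):
        block = ["T", "T", "C", "C"]
        rng.shuffle(block)
        seq.extend(block)
    return seq

# 25 blocks of 4 per job type -> 100 applications per job type (600 total
# across both arms would follow from the planned sample). Seed offsets are
# hypothetical, not the registry's actual scheme.
assignments = {
    job: block_sequence(25, seed=12345 + i)
    for i, job in enumerate(["marketing", "HR", "finance_accounting"])
}
```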
Randomization Unit
Job posting
Was the treatment clustered?
No
Experiment Characteristics
Sample size: planned number of clusters
NA
Sample size: planned number of observations
600
Sample size (or number of clusters) by treatment arms
300 applications with treatment and 300 applications control, equally split between the three job types (such that each trial arm has 100 applications).
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
We conducted sample size calculations using a baseline callback rate of 17%, estimated from a small pilot. With conventional levels of power and significance, we estimate that a total sample of 600 applications allows us to detect an effect size of 9.4 percentage points (on our primary outcomes) for the main comparison, i.e., applications with vs. without AI skills across all three job types. The power calculation was conducted using the tool provided by McConnell and Vera-Hernández (2015).

References

McConnell, B. & Vera-Hernández, M. (2015). Going Beyond Simple Sample Size Calculations: A Practitioner's Guide. Institute for Fiscal Studies, working paper W15/17.
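The stated minimum detectable effect can be approximated with a standard two-sample test of proportions, assuming (as is conventional) 80% power and a 5% two-sided significance level. This is a back-of-the-envelope sketch, not the McConnell and Vera-Hernández (2015) tool itself:

```python
from math import sqrt

def mde_two_proportions(p0: float, n_per_arm: int) -> float:
    # Minimum detectable effect for a two-sample test of proportions,
    # via fixed-point iteration on the normal-approximation formula.
    # z-values assume 5% two-sided significance and 80% power.
    z_alpha, z_beta = 1.960, 0.842
    delta = 0.0
    for _ in range(50):
        p1 = p0 + delta
        se = sqrt(p0 * (1 - p0) / n_per_arm + p1 * (1 - p1) / n_per_arm)
        delta = (z_alpha + z_beta) * se
    return delta

# Baseline callback rate 17%, 300 applications per arm
print(round(mde_two_proportions(0.17, 300), 3))  # ~0.094, i.e. 9.4 pp
```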
IRB
INSTITUTIONAL REVIEW BOARDS (IRBs)
IRB Name
German Association for Experimental Economic Research e.V.
IRB Approval Date
2021-01-29
IRB Approval Number
EJ4BBJir
Analysis Plan

There are documents in this trial unavailable to the public.