Impact of AI Skills on Callback in Job Applications: Evidence from a Field Experiment

Last registered on April 25, 2021

Pre-Trial

Trial Information

General Information

Title
Impact of AI Skills on Callback in Job Applications: Evidence from a Field Experiment
RCT ID
AEARCTR-0007097
Initial registration date
March 01, 2021

First published
March 02, 2021, 6:41 AM EST


Last updated
April 25, 2021, 5:13 AM EDT


Locations

Primary Investigator

Affiliation
Humboldt-Universität zu Berlin

Other Primary Investigator(s)

PI Affiliation
Humboldt-Universität zu Berlin
PI Affiliation
Humboldt-Universität zu Berlin

Additional Trial Information

Status
Ongoing
Start date
2021-03-02
End date
2021-05-31
Secondary IDs
Abstract
We investigate the demand for AI skills in real job applications with a correspondence experiment. We send fictitious job applications in response to real job postings for entry-level employees with a business administration background on UK-based online job boards, randomly varying whether the CVs include AI skills. First, we study the effect of these skills on the callback rate. Second, we investigate potential differences across three types of business administration jobs (marketing, HR, and finance and accounting).
External Link(s)

Registration Citation

Citation
Danilov, Anastasia, Teo Firpo and Lukas Niemann. 2021. "Impact of AI Skills on Callback in Job Applications: Evidence from a Field Experiment." AEA RCT Registry. April 25. https://doi.org/10.1257/rct.7097-1.2
Experimental Details

Interventions

Intervention(s)
We adopt a 3 × 2 design. For each of the three types of business administration jobs (marketing, HR, and finance and accounting), we create a CV with a cover letter. The documents across the three job types are comparable in terms of skills and experience. We then randomly either add AI skills to each applicant's documents or leave them unchanged; apart from the AI skills, the applications within each job type are identical. We select suitable job postings from UK-based online job boards in each of the three job categories and send the CV and cover letter for the specific job type, randomizing whether they include AI skills.
Intervention Start Date
2021-03-02
Intervention End Date
2021-04-16

Primary Outcomes

Primary Outcomes (end points)
Our primary outcome is callback from the employer. A callback is any response from the employer indicating interest in the applicant. Following Baert et al. (2013), we distinguish two versions of callback:

1. Callback sensu stricto is any callback from an employer that implies a direct job offer or an invitation to an interview. Callbacks implying any other positive response, rejections, and non-responses are not counted.
2. Callback sensu lato includes any callback from an employer that implies a positive response. Rejections and non-responses are not counted.

Callbacks are received by email or voice mail.
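
As a minimal coding sketch of how the two outcomes could be constructed from a recorded response type: the value labels below are hypothetical, and the treatment of ambiguous categories (such as a phone call without a voice-mail message) is our assumption; the registered coding is specified in the (non-public) pre-analysis plan.

```python
# Illustrative mapping from the recorded response type to the two callback
# outcomes defined above. Labels are hypothetical, not taken from the PAP.
STRICT = {"direct_job_offer", "interview_invitation"}

POSITIVE = STRICT | {
    "alt_position_interview",
    "voicemail_callback_request",
    "assessment_center_invitation",
    "call_no_voicemail",   # assumed to count as a positive response
    "info_request",
}

def code_callbacks(response_type: str) -> tuple[bool, bool]:
    """Return (callback sensu stricto, callback sensu lato)."""
    return response_type in STRICT, response_type in POSITIVE
```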

References

Baert, S., Cockx, B., Gheyle, N. & Vandamme, C. (2013). Do Employers Discriminate Less if Vacancies are Difficult to Fill? Evidence from a Field Experiment. CESifo Working Paper Series 4093, CESifo.
Primary Outcomes (explanation)
For a detailed description of how the variables are constructed, please refer to the pre-analysis plan.

Secondary Outcomes

Secondary Outcomes (end points)
Additionally, we measure the response type. This is an ordinal variable; its possible values, in descending order of favorability, are: direct job offer, invitation to an interview, invitation to an interview for an alternative position, phone voice mail with a request to call back, invitation to an assessment center, phone call without a voice-mail message, request for additional information, rejection, and no response.

We also measure the response time, defined as the time between sending the application and receiving an answer from the employer by email or voice mail.
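
A sketch of how the ordinal response-type scale and the response time could be encoded, assuming hypothetical value labels and timestamp fields (the registered construction is described in the pre-analysis plan):

```python
from datetime import datetime

# Ordinal response-type scale in descending order of favorability,
# following the list above (9 = most favorable; labels are illustrative).
RESPONSE_RANK = {
    "direct_job_offer": 9,
    "interview_invitation": 8,
    "alt_position_interview": 7,
    "voicemail_callback_request": 6,
    "assessment_center_invitation": 5,
    "call_no_voicemail": 4,
    "info_request": 3,
    "rejection": 2,
    "no_response": 1,
}

def response_time_days(sent_at: datetime, answered_at: datetime) -> float:
    """Time from sending the application to the first answer by email or voice mail."""
    return (answered_at - sent_at).total_seconds() / 86400
```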
Secondary Outcomes (explanation)
For a detailed description of how the variables are constructed, please refer to the pre-analysis plan.

Experimental Design

Experimental Design
We use a 3 × 2 experimental design to investigate the effect of AI skills on job-search success. First, for each job type (marketing, HR, and finance and accounting) we create a control CV and cover letter for a fictitious applicant without AI skills. In a second variant, we add AI skills to the applicant profile. In sum, we have two CVs for each of the three job types. For each job type, we search for matching job postings for entry-level employees with a business administration background on UK-based online job boards and randomly send one of the two applicant profiles (with or without AI skills) to the employer. Conditional on being able to obtain the UK company number of the respective company, we also collect data on the company's SIC code and age (using the UK's Companies House database). We track callback from employers for each observation. We first compare the callback rate (both sensu stricto and sensu lato) between all applications with AI skills (pooled across the three job types) and all applications without. We then compare callback rates by job type and presence of AI skills.
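
The registered analysis plan is not public; purely as an illustration of the headline comparison, one could estimate a linear probability model of callback on the AI-skills dummy with job-type fixed effects. The data below are simulated, not actual results:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated stand-in for the 600 applications (NOT real data): 100 treatment
# and 100 control applications in each of the three job types.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "ai_skills": np.tile([1, 0], 300),
    "job_type": np.repeat(["marketing", "hr", "finance_accounting"], 200),
})
df["callback_strict"] = rng.binomial(1, 0.17 + 0.094 * df["ai_skills"])

# Pooled comparison: callback on the AI-skills dummy with job-type fixed
# effects and heteroskedasticity-robust standard errors.
fit = smf.ols("callback_strict ~ ai_skills + C(job_type)", data=df).fit(cov_type="HC1")
print(fit.summary())
```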
Experimental Design Details
Randomization Method
The randomization is stratified by job type. Because we cannot control how many job postings of each job type will be available in any given week, and we want to ensure numerical balance between the treatment and control groups over the timeline, we randomize job postings in blocks of four (e.g., Treatment-Control-Control-Treatment). Between March and April 2021, we will search for suitable job postings in each job type on a weekly basis and apply. We also randomize the order in which we search for, and apply to, job postings across job types (so that we do not always start with, e.g., 'marketing' and end with 'HR', but instead rotate through them daily). To maintain the integrity of the randomization, we will respect the block structure of the randomization sequence (i.e., we ensure that at least four eligible postings are available before starting a new block). We will only send applications to job postings that do not immediately require a test or supplementary information beyond a CV and cover letter, to avoid immediate rejections due to insufficient information. We will not send more than one application to any given company, to avoid detection.

Randomization was carried out in Stata (seed 12345); randomization code is available upon request.
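
The registration states that randomization was carried out in Stata; as an illustration only, the Python sketch below reproduces the logic of the stratified block design (the stratum and block labels are ours, and Python's random number generator will not reproduce the Stata sequence, seed notwithstanding):

```python
import random

random.seed(12345)  # seed value from the registration; Python's RNG differs from Stata's

JOB_TYPES = ["marketing", "hr", "finance_accounting"]   # strata
BLOCK = ["T", "T", "C", "C"]                            # 2 treatment, 2 control per block of four

def assignment_sequence(n_blocks: int) -> list[str]:
    """Balanced T/C sequence for one stratum, randomized within blocks of four."""
    seq = []
    for _ in range(n_blocks):
        block = BLOCK.copy()
        random.shuffle(block)   # permute within each block
        seq.extend(block)
    return seq

# 200 applications per job type -> 50 blocks of four per stratum
sequences = {job: assignment_sequence(50) for job in JOB_TYPES}
```

Shuffling within fixed blocks of four guarantees exact treatment-control balance after every fourth application in each stratum, which is the stated purpose of the block design.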
Randomization Unit
Job posting
Was the treatment clustered?
No

Experiment Characteristics

Sample size: planned number of clusters
NA
Sample size: planned number of observations
600
Sample size (or number of clusters) by treatment arms
300 treatment applications and 300 control applications, split equally across the three job types (so that each of the six job-type × treatment cells contains 100 applications).
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
We conducted sample size calculations using a baseline callback rate of 17%, estimated from a small pilot. With standard values for power and significance, we estimate that a total sample of 600 applications allows us to detect an effect of 9.4 percentage points on our primary outcomes for the main comparison, i.e., applications with versus without AI skills, pooled across the three job types. The power calculation was conducted using the tool provided by McConnell and Vera-Hernández (2015).

References

McConnell, B. & Vera-Hernández, M. (2015). Going Beyond Simple Sample Size Calculations: A Practitioner's Guide. Institute for Fiscal Studies, working paper W15/17.
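
The registered minimum detectable effect can be checked with a standard two-sample proportions formula. The sketch below assumes 80% power and a 5% two-sided test (the "standard values" are not spelled out in the registration); under these assumptions, 300 applications per arm and a 17% baseline yield roughly 80% power for a 9.4 percentage-point effect.

```python
from math import sqrt
from scipy.stats import norm

p1, n = 0.17, 300      # baseline callback rate; applications per arm
delta = 0.094          # candidate effect: 9.4 percentage points
p2 = p1 + delta
alpha = 0.05           # assumed two-sided significance level

pbar = (p1 + p2) / 2
se0 = sqrt(2 * pbar * (1 - pbar) / n)                # SE of difference under H0
se1 = sqrt(p1 * (1 - p1) / n + p2 * (1 - p2) / n)    # SE of difference under H1
power = norm.cdf((delta - norm.ppf(1 - alpha / 2) * se0) / se1)
print(f"power = {power:.2f}")   # ~0.80, consistent with the registered MDE
```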
IRB

Institutional Review Boards (IRBs)

IRB Name
German Association for Experimental Economic Research e.V.
IRB Approval Date
2021-01-29
IRB Approval Number
EJ4BBJir
Analysis Plan

There is information in this trial unavailable to the public.

Post-Trial

Post Trial Information

Study Withdrawal

There is information in this trial unavailable to the public.

Intervention

Is the intervention completed?
Yes
Intervention Completion Date
April 16, 2021, 12:00 +00:00
Data Collection Complete
No
Data Publication


Is public data available?
No

Program Files

Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials