The Supply Effects of Additional Screening in Recruitment

Last registered on July 17, 2024

Pre-Trial

Trial Information

General Information

Title
The Supply Effects of Additional Screening in Recruitment
RCT ID
AEARCTR-0013356
Initial registration date
April 12, 2024

First published
April 16, 2024, 3:15 PM EDT

Last updated
July 17, 2024, 4:07 AM EDT

Locations

Region

Primary Investigator

Affiliation
Monash University

Other Primary Investigator(s)

PI Affiliation
Monash University
PI Affiliation
University of Gothenburg
PI Affiliation
University of Exeter

Additional Trial Information

Status
Completed
Start date
2024-04-22
End date
2024-05-30
Secondary IDs
Prior work
This trial does not extend or rely on any prior RCTs.
Abstract
In this project, we study the impact of screening stages in recruitment on applicant behavior.
External Link(s)

Registration Citation

Citation
Avery, Mallory et al. 2024. "The Supply Effects of Additional Screening in Recruitment." AEA RCT Registry. July 17. https://doi.org/10.1257/rct.13356-2.0
Experimental Details

Interventions

Intervention(s)
In this project, we study whether job applicants respond to different screening methods during the recruitment process for a real job. To study this, we experimentally vary the method used to screen applicants.

We consider this the supply-side experiment. A future demand-side experiment will be pre-registered prior to collecting the demand-side data.
Intervention Start Date
2024-04-22
Intervention End Date
2024-05-30

Primary Outcomes

Primary Outcomes (end points)
We collect the following primary outcomes on the supply side:
• Initial Dropout
• Screening Dropout
Primary Outcomes (explanation)
Initial Dropout: Upon receiving an email asking them to take part in a screening task (or, in the control, to confirm continued interest in the job), applicants must click a link/button that confirms their interest. This variable equals one if the applicant does not click the link/button and zero if they do.

Screening Dropout: This is a dummy variable equal to one if the candidate does not complete the screening. A candidate is considered to have completed the screening if they answer all interview questions.

We will analyze these outcomes overall and by job.
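
For concreteness, below is a minimal sketch of how these two dummies could be coded from applicant-level data. The column names (clicked_link, n_questions_answered, n_questions_total) are hypothetical placeholders for this sketch, not variables from our data.

```python
import pandas as pd

def code_primary_outcomes(df: pd.DataFrame) -> pd.DataFrame:
    """Add the two primary outcome dummies to an applicant-level table.
    Column names are hypothetical placeholders for illustration."""
    out = df.copy()
    # Initial Dropout: 1 if the applicant never clicked the confirmation link.
    out["initial_dropout"] = (~out["clicked_link"].astype(bool)).astype(int)
    # Screening Dropout: 1 if the applicant did not answer every interview question.
    out["screening_dropout"] = (
        out["n_questions_answered"] < out["n_questions_total"]
    ).astype(int)
    return out
```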

Secondary Outcomes

Secondary Outcomes (end points)
To understand possible mechanisms, we elicit the following secondary outcome variables:

Evaluation scores: This is the score of the assessment calculated by the algorithm/human.

Time taken: This is calculated as the time the candidate spends completing the whole screening.

Type of language used: We will use an algorithm to assess the type of language used in applicants’ responses.

Fairness/bias perceptions: We will elicit applicants' perceptions of whether the interview process they experienced was fair or biased.

Interview anxiety: We will elicit applicants’ interview anxiety through the index of McCarthy and Goffin (2004).

Measure of perceived meritocracy of the interview process: We will elicit applicants’ perceptions of meritocracy for different types of interview methodologies.

Recommendation intention: We will elicit applicants' willingness to recommend the employer, as an indicator of their perception of the employer, using a modified Reichheld (2003) measure.
Secondary Outcomes (explanation)
See above

Experimental Design

Experimental Design
Our design aims to measure the impact of various types of additional screening stages in the recruitment process on the behavior of job applicants.
Experimental Design Details
The design consists of two stages.

In stage 1, we will post job ads for three real jobs: a web developer, a programmer, and a content creator. The positions will be advertised across the United States on several job portals (e.g., dice.com, indeed.com) for approximately two months each. To apply, applicants must submit their CV and fill out a short application (e.g., years of programming experience, demographics). Applicants must reside in the United States.

In stage 2, after applications close, we will randomize applicants into either the control or one of the supply-side treatments. We will then email all applicants. This email will lay out the timeline applicants can expect: the next step (the screening assessment, if they are in a treatment), an interview with a select group of candidates, and then hiring. The email will also contain the link for applicants to take their next step, which depends on the treatment they have been assigned, and we will record whether applicants click this link. In the control, the link simply indicates continued interest in the position; in the treatments, it leads to the assessment.

The assessment involves responding to standard interview questions, and the same questions will be used regardless of treatment. The assessment will take place on Hirevue for the Audio Interview, Video Interview, Audio Interview with Human Evaluation, and Video Interview with Human Evaluation treatments. For the Zoom Interview treatment, the interview will be scheduled using a scheduling website and will take place on Zoom. The Zoom interviews will be conducted by people with prior interviewing experience who will be blind to the study's purpose. After completing the assessment, applicants will be directed to a short applicant-experience survey that will measure attitudes, including attitudes towards the different assessments. We will also conduct a similar survey among those who drop out.

We will then follow the rest of the hiring process as outlined in the email sent to applicants.

We will also conduct a demand-side experiment. Further details about that experiment will be provided in a separate pre-analysis plan, after the supply-side stage 2 data have been collected but before the demand-side experiment commences.
Randomization Method
Randomization will be carried out by a computer.
Randomization Unit
The randomization unit will be the individual for all treatments.
Was the treatment clustered?
No

Experiment Characteristics

Sample size: planned number of clusters
Treatment is assigned at the level of the individual, so clustering will take place at the individual level.
Sample size: planned number of observations
The number of clusters will equal the number of legitimate candidates. Based on previous experience, we expect at least 2,400 legitimate job applicants across the three jobs, though this will likely vary depending on a number of factors; we plan to keep the jobs open until we have at least 2,400 applicants. A legitimate candidate is someone who resides in the United States and completes the initial application form. We plan to assign an equal proportion of the sample to each treatment except for the Zoom treatment, which will receive a smaller share depending on the available evaluation resources. Further, we will stratify by gender such that each treatment contains the same proportion of men and women.
Sample size (or number of clusters) by treatment arms
We plan to have at least 2,400 applicants. We will split them equally across the treatments, except for the Zoom treatment, whose size will depend on available resources.
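
As an illustration of the computerized, gender-stratified assignment described above, a minimal sketch follows. The arm labels, allocation shares, and seed are illustrative assumptions rather than the registered allocation; in particular, the Zoom share will depend on available evaluation resources.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(20240422)  # illustrative seed, not the one we will use

# Hypothetical arm labels and allocation shares: equal shares for the control
# and the four asynchronous interview treatments, a smaller share for Zoom.
ARMS = ["control", "audio", "video", "audio_human", "video_human", "zoom"]
SHARES = np.array([0.19, 0.19, 0.19, 0.19, 0.19, 0.05])

def assign(applicants: pd.DataFrame) -> pd.DataFrame:
    """Assign each applicant to an arm, stratifying by gender so that every
    arm receives the same proportion of men and women."""
    out = applicants.copy()
    out["arm"] = None
    for _, idx in out.groupby("gender").groups.items():
        idx = rng.permutation(idx)                           # shuffle within the stratum
        cuts = (np.cumsum(SHARES) * len(idx)).round().astype(int)
        for arm, lo, hi in zip(ARMS, np.r_[0, cuts[:-1]], cuts):
            out.loc[idx[lo:hi], "arm"] = arm                 # contiguous slice of the shuffle
    return out

# Hypothetical usage with a toy applicant table of the planned size:
applicants = pd.DataFrame({"gender": ["woman", "man"] * 1200})
print(assign(applicants).groupby(["gender", "arm"]).size())
```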
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
Assuming a sample of 400 in each treatment, 80% power, and a baseline proportion of 0.50 for the binary outcome in the absence of treatment, we expect an MDE of 0.10. The MDE for the Zoom treatment will be larger than for the other treatments, implying that this treatment may be underpowered; this arises from the practical challenges of conducting live interviews with a substantial number of candidates.
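
The 0.10 figure is consistent with the standard normal-approximation formula for the difference between two proportions. Below is a short check under the stated assumptions (400 per arm, 80% power, baseline proportion 0.50); the two-sided 5% significance level is our assumption, as it is not stated above.

```python
from math import sqrt
from scipy.stats import norm

def mde_two_proportions(n_per_arm, p0=0.50, alpha=0.05, power=0.80):
    """Minimum detectable difference in proportions between two equal-sized
    arms, using the normal approximation with variance evaluated at p0."""
    z_alpha = norm.ppf(1 - alpha / 2)   # critical value for a two-sided test
    z_beta = norm.ppf(power)            # quantile corresponding to the target power
    return (z_alpha + z_beta) * sqrt(2 * p0 * (1 - p0) / n_per_arm)

print(round(mde_two_proportions(400), 3))  # ~0.099, i.e. roughly 0.10
```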
Supporting Documents and Materials

There is information in this trial unavailable to the public.

IRB

Institutional Review Boards (IRBs)

IRB Name
Monash University Ethics Committee
IRB Approval Date
2018-03-07
IRB Approval Number
N/A

Post-Trial

Post Trial Information

Study Withdrawal

There is information in this trial unavailable to the public.

Intervention

Is the intervention completed?
No
Data Collection Complete
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials