AI perception in personnel selection: Stakeholder perspective

Last registered on April 13, 2023


Trial Information

General Information

AI perception in personnel selection: Stakeholder perspective
Initial registration date
April 07, 2023

Initial registration date is when the trial was registered.

It corresponds to when the registration was submitted to the Registry to be reviewed for publication.

First published
April 13, 2023, 4:04 PM EDT

First published corresponds to when the trial was first made public on the Registry after being reviewed.

Last updated
April 13, 2023, 4:15 PM EDT

Last updated is the most recent time when changes to the trial's registration were published.



Primary Investigator

University of Zurich

Other Primary Investigator(s)

PI Affiliation

Additional Trial Information

Ongoing
Start date
End date
Secondary IDs
Prior work
This trial does not extend or rely on any prior RCTs.
AI-based personnel selection tools are efficient. However, they can generate disparities in selection rates across groups of applicants and thereby violate fairness requirements. In this project we investigate how people perceive this efficiency-fairness tradeoff. Having previously observed people's general fairness evaluations and preferences over efficient and fair algorithms, in this study we examine how the stakeholder perspective affects perceptions of AI in the HR context. For this purpose, we compare the tradeoff perceptions of UK managers and of students in the role of job applicants. Based on prominent fairness theory in organizational justice, we expect that managers prefer more efficient algorithms, while students favour less efficient algorithms with lower disparity rates.
External Link(s)

Registration Citation

Kandul, Serhiy and Ulrich Leicht-Deobald. 2023. "AI perception in personnel selection: Stakeholder perspective." AEA RCT Registry. April 13.
Experimental Details


We manipulate the stakeholder perspective and observe how this affects people's preferences over efficient and fair selection algorithms.
Intervention Start Date
Intervention End Date

Primary Outcomes

Primary Outcomes (end points)
Fairness perceptions; preferences (choices) over algorithms
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
We present participants with a series of selection algorithms. Each algorithm is defined by two major features: efficiency, measured as the quality of the selection, and fairness, measured as the degree of disparity in selection rates across gender. There is always an efficiency-fairness tradeoff. We manipulate the stakeholder perspective between subjects (managers take the role of HR managers in a company, while students act as job applicants) and the degree of the efficiency-fairness tradeoff within subject. We then compare fairness perceptions and choices over algorithms between managers and students.
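The two features that define each algorithm can be made concrete in a small sketch. This is not from the registration itself: the metric names, data, and the specific choice of "absolute gap in selection rates" (demographic parity difference) and "mean quality of selected applicants" are illustrative assumptions.

```python
# Hypothetical illustration of the two algorithm features described above.
# "Fairness" is taken as the absolute gap in selection rates across gender
# (demographic parity difference); "efficiency" as the mean quality score
# of the selected applicants. All names and data are made up.

def selection_rate(selected, group, label):
    """Share of applicants in the given group who were selected."""
    flags = [s for s, g in zip(selected, group) if g == label]
    return sum(flags) / len(flags)

def disparity(selected, group):
    """Absolute difference in selection rates between the two groups."""
    labels = sorted(set(group))
    rates = [selection_rate(selected, group, lab) for lab in labels]
    return abs(rates[0] - rates[1])

def efficiency(selected, quality):
    """Mean quality score among the selected applicants."""
    chosen = [q for s, q in zip(selected, quality) if s]
    return sum(chosen) / len(chosen)

# One hypothetical selection outcome over six applicants.
selected = [1, 1, 0, 1, 0, 0]               # 1 = selected
group    = ["f", "m", "f", "m", "f", "m"]
quality  = [0.9, 0.8, 0.7, 0.6, 0.5, 0.4]

print(round(disparity(selected, group), 3))    # fairness feature
print(round(efficiency(selected, quality), 3)) # efficiency feature
```

An algorithm that lowers the disparity value typically does so by selecting lower-quality applicants, which is the tradeoff participants evaluate.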
Experimental Design Details
Randomization Method
Randomization is done by the survey software Qualtrics.
Randomization Unit
Was the treatment clustered?

Experiment Characteristics

Sample size: planned number of clusters
Sample size: planned number of observations
Sample size (or number of clusters) by treatment arms
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)

Institutional Review Boards (IRBs)

IRB Name
IRB Approval Date
IRB Approval Number


Post Trial Information

Study Withdrawal

There is information in this trial unavailable to the public.


Is the intervention completed?
Data Collection Complete
Data Publication

Data Publication

Is public data available?

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials