The Impact of Explanatory Strategies on Employee Trust in Algorithmic Forecasts: An Experimental Investigation

Last registered on March 11, 2025

Pre-Trial

Trial Information

General Information

Title
The Impact of Explanatory Strategies on Employee Trust in Algorithmic Forecasts: An Experimental Investigation
RCT ID
AEARCTR-0014040
Initial registration date
August 14, 2024


First published
August 14, 2024, 3:52 PM EDT


Last updated
March 11, 2025, 6:30 AM EDT


Locations

Primary Investigator

Affiliation
Ulm University

Other Primary Investigator(s)

PI Affiliation
Ulm University

Additional Trial Information

Status
In development
Start date
2025-03-11
End date
2025-03-22
Secondary IDs
Prior work
This trial does not extend or rely on any prior RCTs.
Abstract
Algorithmically generated information is playing an increasingly central role in managers' decision-making processes. However, the value of these insights is only fully realized when they are effectively utilized. Consequently, the literature on algorithm aversion has explored the factors that influence their adoption. One underexplored phenomenon is managers' tendency to seek reassurance from their superiors. In our experiment, we investigate whether such advice-seeking behavior occurs and whether it can be influenced by providing explanations through explainable AI (XAI). We employ an online experiment involving crowd workers to isolate the effects of different XAI methods (Feature Importance Explanations vs. Counterfactual Explanations) on advice-seeking behavior.
External Link(s)

Registration Citation

Citation
Röder, Andreas and Mischa Seiter. 2025. "The Impact of Explanatory Strategies on Employee Trust in Algorithmic Forecasts: An Experimental Investigation." AEA RCT Registry. March 11. https://doi.org/10.1257/rct.14040-3.0
Experimental Details

Interventions

Intervention(s)
Participants assume the role of a store manager in a convenience store chain. They are tasked with forecasting sales for the upcoming month using a sales development dashboard similar to contemporary business intelligence tools. The store chain uses AI to assist the store manager in the forecasting task. We implement two explanatory strategies, (i) Feature Importance Explanations and (ii) Counterfactual Explanations, presented alongside the AI forecasts. During the experiment, participants can interact with the sales development dashboard to learn about the store from the information provided by the system, the AI prediction, and the explanation associated with their experimental condition.
Intervention (Hidden)

The intervention is designed to investigate the following competing hypotheses:

1. Explanations reduce advice-seeking behavior.
2. Explanations increase advice-seeking behavior.
3. Counterfactual Explanations are more effective than Feature Importance Explanations.
Intervention Start Date
2025-03-11
Intervention End Date
2025-03-22

Primary Outcomes

Primary Outcomes (end points)
Advice seeking from supervisors
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
Weight of AI advice
Secondary Outcomes (explanation)
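In the judge–advisor literature, "weight of advice" (WOA) is typically operationalized as the share of the gap between the initial estimate and the advisor's recommendation that the final judgment closes. A minimal sketch of that standard measure, applied here to the AI forecast (the registration does not state its exact operationalization, so this is an assumption):

```python
def weight_of_advice(initial: float, advice: float, final: float) -> float:
    """Standard WOA measure: (final - initial) / (advice - initial).

    1.0 means the final target fully adopts the AI forecast, 0.0 means
    the initial estimate is kept. Returns 0.0 when the forecast equals
    the initial estimate (no gap to close).
    """
    gap = advice - initial
    if gap == 0:
        return 0.0
    return (final - initial) / gap


# Example: initial estimate 100, AI forecast 120, final target 115
woa = weight_of_advice(100, 120, 115)  # → 0.75
```

In practice the measure is often truncated to the [0, 1] interval to handle overshooting or movement away from the advice.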

Experimental Design

Experimental Design
Participants will set sales targets for a fictitious convenience store. They will be randomly divided into three experimental conditions, defined by the levels of our independent variable:

• Algorithmic Advice (AI vs. AI & Feature Importance Explanation vs. AI & Counterfactual Explanation)

On the welcome page of the experiment, we will inform participants about the procedure and about potential exclusion through attention checks. They are instructed to immerse themselves in the situation of setting targets for a convenience store. To incentivize effort, participants receive a bonus for setting accurate sales targets.

The experiment can be sectioned into the following steps:
1. Participants receive general information about their store and the store's past sales performance.
2. Participants indicate their sales expectations for the upcoming month.
3. Participants are provided with an AI forecast of expected sales for the upcoming month (and a corresponding explanation, depending on the experimental condition).
4. Based on the information provided beforehand, participants must set a sales target for their store.
5. Participants can choose to ask their supervisor for advice. However, advice is costly and reduces their potential bonus payment.
6. Participants can readjust their previously selected sales target based on the information provided by their supervisor.
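The registration does not specify the payoff parameters behind steps 4–5. A hypothetical sketch of how a cost-of-advice bonus rule could look (the base bonus, advice fee, and accuracy tolerance are all invented for illustration):

```python
def bonus(target: float, actual: float, asked_advice: bool,
          base: float = 2.00, advice_cost: float = 0.50,
          tolerance: float = 0.05) -> float:
    """Hypothetical incentive rule: pay the base bonus if the sales
    target lands within `tolerance` (relative error) of actual sales,
    minus a fixed fee if the supervisor was consulted. All amounts are
    illustrative, not the registered payoff scheme."""
    accurate = abs(target - actual) <= tolerance * actual
    pay = base if accurate else 0.0
    if asked_advice:
        pay -= advice_cost  # advice is costly (step 5)
    return max(pay, 0.0)    # bonus cannot go negative
```

The key design feature this captures is that consulting the supervisor trades a certain payoff reduction against a possibly more accurate target.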

After completing the target-setting process, participants answer a post-experimental questionnaire querying their experience and general preferences. Finally, participants are asked to provide sociodemographic information.
Experimental Design Details
Randomization Method
Participants will be randomly assigned to the experimental conditions by a designated function of the website.
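Assignment "by a designated function of the website" can be sketched as an independent uniform draw per participant. A minimal illustration (condition labels and the per-ID seeding are assumptions, not the site's actual implementation):

```python
import random

CONDITIONS = ["AI", "AI + Feature Importance", "AI + Counterfactual"]

def assign(participant_id: str) -> str:
    """Draw one of the three conditions uniformly at random,
    independently per participant (randomization at the participant
    level, no clustering). Seeding by participant ID makes the
    assignment reproducible for the same ID."""
    rng = random.Random(participant_id)
    return rng.choice(CONDITIONS)
```

Seeding per participant is one common choice; a site could equally draw from a shared server-side generator or use block randomization to balance arm sizes.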
Randomization Unit
Randomization will be done at the participant level.
Was the treatment clustered?
No

Experiment Characteristics

Sample size: planned number of clusters
-
Sample size: planned number of observations
Our sample size will be calculated based on the results of a pre-test using Stata. Expected sample size: 540 participants.
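The planned total of 540 across three arms can be sanity-checked with an a-priori power calculation for a one-way ANOVA. A minimal Python sketch (the registration uses Stata; this port and the effect sizes below are illustrative assumptions, not the pre-test values):

```python
from scipy.stats import f as f_dist, ncf

def anova_total_n(effect_f: float, k_groups: int = 3,
                  alpha: float = 0.05, target_power: float = 0.80) -> int:
    """Smallest total N for a one-way ANOVA omnibus test,
    given an assumed effect size in Cohen's f."""
    n = k_groups + 1
    while True:
        df1, df2 = k_groups - 1, n - k_groups
        crit = f_dist.ppf(1 - alpha, df1, df2)                # critical F under H0
        power = 1 - ncf.cdf(crit, df1, df2, effect_f**2 * n)  # noncentral F under H1
        if power >= target_power:
            return n
        n += 1
```

For example, a medium effect (f = 0.25) requires a total N of roughly 160, while smaller effects push the requirement toward the planned 540; the registered number will instead be derived from the pre-test estimates.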
Sample size (or number of clusters) by treatment arms
-
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
IRB

Institutional Review Boards (IRBs)

IRB Name
Ethikkommission der Universität Ulm
IRB Approval Date
2024-07-29
IRB Approval Number
N/A

Post-Trial

Post Trial Information

Study Withdrawal


Intervention

Is the intervention completed?
No
Data Collection Complete
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials