Algorithm Aversion in Forecasting: A Behavioral Investigation

Last registered on July 02, 2017

Pre-Trial

Trial Information

General Information

Title
Algorithm Aversion in Forecasting: A Behavioral Investigation
RCT ID
AEARCTR-0002298
Initial registration date
July 01, 2017


First published
July 02, 2017, 10:55 PM EDT


Locations

Region

Primary Investigator

Affiliation

Other Primary Investigator(s)

Additional Trial Information

Status
In development
Start date
2017-07-17
End date
2017-08-17
Secondary IDs
Abstract
The research project is positioned within the field of behavioral decision making and focuses on algorithm aversion in forecasting decisions. Algorithm aversion describes the phenomenon that many individuals rely on human forecasters and discard algorithms even though the latter generally perform better. Previous research has shown the reasons for this to be manifold and has also discussed possible managerial interventions. However, it has not yet been examined whether the complexity of the forecasting algorithm, or the reputation of the algorithm's developer/provider, influences individuals' decisions to apply an algorithm. This is of particular relevance today, as forecasts are becoming more complex in the wake of big data and machine learning technologies. Since forecasts employing such methods are expected to be more accurate than both human forecasts and simple statistical forecasting models, failure to make use of them could be very costly for companies. This study therefore examines to what extent individuals trust algorithms along two dimensions: the complexity of the forecast and the reputation of its developer/provider. In other words, how important is it for decision makers to understand an algorithm, and are they influenced by the reputation of its developer/provider?
The study will consist of four treatments (a full factorial design of high/low algorithm complexity and high/low developer/provider reputation) in which participants will be asked to choose between making their own forecast and following the algorithm. The monetary incentive will be based on forecast accuracy.
External Link(s)

Registration Citation

Citation
Klöckner, Martin. 2017. "Algorithm Aversion in Forecasting: A Behavioral Investigation." AEA RCT Registry. July 02. https://doi.org/10.1257/rct.2298-1.0
Former Citation
Klöckner, Martin. 2017. "Algorithm Aversion in Forecasting: A Behavioral Investigation." AEA RCT Registry. July 02. https://www.socialscienceregistry.org/trials/2298/history/19117
Sponsors & Partners

There is information in this trial unavailable to the public.
Experimental Details

Interventions

Intervention(s)
In the control treatment, participants will be presented with an algorithm of low complexity and low reputation. In the second treatment, participants will be presented with an algorithm of high complexity but still low reputation. In treatments 3 and 4, the algorithm's reputation will be high at both levels of complexity.
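The 2x2 structure of the four treatments can be enumerated as follows (a minimal sketch of the factorial design; the condition labels are illustrative, not taken from the study's materials):

```python
from itertools import product

# Cross the two factors (complexity, reputation) to obtain the four treatments
# of the 2x2 full factorial design.
treatments = [
    {"complexity": c, "reputation": r}
    for c, r in product(["low", "high"], ["low", "high"])
]

# The low-complexity / low-reputation cell is the control treatment.
control = {"complexity": "low", "reputation": "low"}
```

Each participant is assigned to exactly one of these four cells (between-subjects), so the design yields one observation of algorithm usage per participant per round within a single complexity/reputation combination.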
Intervention Start Date
2017-07-25
Intervention End Date
2017-07-26

Primary Outcomes

Primary Outcomes (end points)
Usage of the algorithm by participants depending on complexity and reputation of the algorithm.
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
Four treatments in a full factorial design of low/high algorithm complexity and low/high reputation. In each treatment, participants are asked to forecast the demand for a product and place an order for that product over a total of 20 rounds. At the beginning of each round, they are presented with the values of three factors, of which only one is correlated with demand. Based on this information they choose a forecasting method (their own forecast or the algorithm); this choice also determines how the order amount is set for the round. Regardless of their choice, they then enter their own forecast. However, only if they previously chose their own forecast will this entry determine the order amount; if they selected the algorithm, the algorithm determines the order amount. At the end of each round, participants are shown the results of their own forecast and of the algorithm, as well as actual demand and their corresponding profit from selling the product.
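The per-round flow described above can be sketched as follows (a minimal illustration only: the price and cost parameters, and the rule that profit comes from selling up to the ordered amount, are hypothetical assumptions, not details from the registration):

```python
def run_round(chose_algorithm, own_forecast, algorithm_forecast, demand,
              price=10, unit_cost=6):
    """One round of the task.

    The participant always enters their own forecast, but the order amount
    is set by whichever method they chose at the start of the round.
    price/unit_cost are illustrative placeholders.
    """
    order = algorithm_forecast if chose_algorithm else own_forecast
    units_sold = min(order, demand)          # cannot sell more than was ordered
    profit = units_sold * price - order * unit_cost
    # Feedback shown at the end of the round: both forecasts, actual demand,
    # and the profit resulting from the chosen order amount.
    return {"order": order, "demand": demand, "profit": profit}
```

For example, a participant who delegates to the algorithm in a given round earns the profit implied by the algorithm's forecast, even though their own forecast was still recorded; this makes the accuracy of both methods observable every round.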
Experimental Design Details
Randomization Method
Randomization is done by the lab's computer system.
Randomization Unit
Individual
Was the treatment clustered?
Yes

Experiment Characteristics

Sample size: planned number of clusters
4 x 30 randomly selected participants, mostly students of social science and economics
Sample size: planned number of observations
120 participants (students)
Sample size (or number of clusters) by treatment arms
30 participants (students) per treatment
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
IRB

Institutional Review Boards (IRBs)

IRB Name
IRB Approval Date
IRB Approval Number

Post-Trial

Post Trial Information

Study Withdrawal

There is information in this trial unavailable to the public.

Intervention

Is the intervention completed?
No
Data Collection Complete
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials