Strategic Interactions with the Assistant of an Algorithm: Information Acquisition Treatment

Last registered on October 25, 2022

Pre-Trial

Trial Information

General Information

Title
Strategic Interactions with the Assistant of an Algorithm: Information Acquisition Treatment
RCT ID
AEARCTR-0010264
Initial registration date
October 18, 2022

First published
October 25, 2022, 10:47 AM EDT

Locations

Region

Primary Investigator

Affiliation
Wuhan University

Other Primary Investigator(s)

PI Affiliation
Wuhan University
PI Affiliation
Wuhan University
PI Affiliation
Wuhan University

Additional Trial Information

Status
Ongoing
Start date
2022-05-01
End date
2022-12-01
Secondary IDs
Prior work
This trial is based on or builds upon one or more prior RCTs.
Abstract
We present a laboratory experiment that examines individuals' willingness to take advice from algorithms provided by artificial intelligence (AI). We explored two distinct channels behind the algorithm: the data and the mechanism. In this additional experiment, we will examine individuals' willingness to pay for algorithm advice using a within-subject information acquisition treatment design. We will also compare the welfare effects, in terms of subject payoffs, of information provision versus information acquisition. The no-information benchmark for this comparison is borrowed from a previously conducted RCT, which we also pre-registered on this website.
External Link(s)

Registration Citation

Citation
Bai, Lu et al. 2022. "Strategic Interactions with the Assistant of an Algorithm: Information Acquisition Treatment." AEA RCT Registry. October 25. https://doi.org/10.1257/rct.10264-1.0
Experimental Details

Interventions

Intervention(s)
This is a laboratory experiment; we will randomly assign student subjects to one of the experimental treatments.
Subjects will be instructed to play a repeated strategic interaction game and will have the opportunity to buy advice from an algorithm.
Subjects' payoffs will be determined by the decisions they and their counterparts make in the game and by the amount they pay for the algorithm's advice.
In the experiment, the maximum amount participants can pay will be lower than the minimum amount they can earn from the interaction game, so no subject can end with a negative payoff.
Intervention Start Date
2022-10-21
Intervention End Date
2022-11-30

Primary Outcomes

Primary Outcomes (end points)
Willingness to pay (WTP) for the algorithm
Primary Outcomes (explanation)
WTP will be elicited via the BSR method in the information acquisition design.
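The registration does not spell out the BSR procedure. As an illustration only, the sketch below implements a standard random-price (Becker-DeGroot-Marschak, BDM-style) elicitation, a common way to make a stated WTP incentive compatible; this is a swapped-in stand-in for the unspecified BSR details, and the price support and example values are placeholders, not values from this trial.

```python
import random

def elicit_wtp_bdm(stated_wtp, max_price=100):
    """BDM-style random-price mechanism (illustrative stand-in, not the
    registered BSR procedure): the subject buys the advice iff a randomly
    drawn price is at or below the stated WTP, and pays that price."""
    price = random.uniform(0, max_price)  # placeholder price support
    buys = stated_wtp >= price
    return buys, (price if buys else 0.0)

# Example: a subject states a WTP of 30 experimental points.
bought, paid = elicit_wtp_bdm(stated_wtp=30)
print(f"advice purchased: {bought}, price paid: {paid:.2f}")
```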

Secondary Outcomes

Secondary Outcomes (end points)
Subject payoff (welfare analysis)
Secondary Outcomes (explanation)
We will assess the welfare effects of information provision versus acquisition by comparing subject payoffs across three conditions: no algorithm advice (from a pilot experiment we ran earlier), free-information rounds, and costly-information rounds.
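As a minimal sketch of how this comparison might be run, the regression below pools payoffs across the three conditions and clusters standard errors at the session level (the randomization unit). The data frame, file name, and variable names are hypothetical, not part of the registration.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format data: one row per subject-round, with a
# 'condition' factor in {"no_ai", "free_info", "costly_info"} and a
# session identifier for clustering (the randomization unit).
df = pd.read_csv("payoffs.csv")  # placeholder file name

model = smf.ols("payoff ~ C(condition, Treatment(reference='no_ai'))",
                data=df)
# Cluster-robust standard errors at the session level, matching the
# clustered design reported in this registration.
res = model.fit(cov_type="cluster", cov_kwds={"groups": df["session_id"]})
print(res.summary())
```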

Experimental Design

Experimental Design
Our experimental design combines between- and within-subject elements.
In the control group, subjects will be instructed to play a one-shot centipede game repeatedly for 10 rounds. Of these, 5 rounds will be played with free algorithm advice and 5 rounds with costly algorithm advice; the order of the free and costly blocks will be counterbalanced across sessions.
In the treatment group, subjects will play the same centipede game for 10 rounds with the same costly-to-free/free-to-costly counterbalancing. The only difference is that in the treatment group, subjects will be informed of the underlying data (including five aspects of the main determinants) that the algorithm uses to generate its predictions. A sketch of the round ordering appears below.
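A minimal sketch of the counterbalanced round ordering described above: the 5-round free and costly blocks come from the registration, while the specific alternation rule across sessions is an illustrative assumption.

```python
def session_round_order(session_index):
    """Return the 10-round advice schedule for a session: 5 free-advice
    rounds and 5 costly-advice rounds, with block order alternating
    across sessions (free-first in even-indexed sessions, costly-first
    in odd ones -- the alternation rule itself is an assumption)."""
    free_block = ["free"] * 5
    costly_block = ["costly"] * 5
    if session_index % 2 == 0:
        return free_block + costly_block
    return costly_block + free_block

print(session_round_order(0))  # free-to-costly session
print(session_round_order(1))  # costly-to-free session
```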
Experimental Design Details
Randomization Method
Randomization is conducted by a computer.
Randomization Unit
Experimental sessions
Was the treatment clustered?
Yes
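Since randomization is computerized and the unit is the experimental session, a minimal sketch of session-level assignment might look like the following; the session labels, seed, and arm names are placeholders (4 sessions per treatment follows the sample-size fields below).

```python
import random

# Placeholder session labels: 8 sessions split evenly across the
# two arms (4 clusters per treatment, per the planned sample size).
sessions = [f"S{i}" for i in range(1, 9)]

rng = random.Random(2022)  # fixed seed for a reproducible draw
arms = ["control"] * 4 + ["treatment"] * 4
rng.shuffle(arms)

assignment = dict(zip(sessions, arms))
print(assignment)
```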

Experiment Characteristics

Sample size: planned number of clusters
Each treatment will have 4 sessions
Sample size: planned number of observations
Each session will have 15-25 student subjects
Sample size (or number of clusters) by treatment arms
We plan to have 60-80 subjects per treatment. With 2 treatments, this gives 120-160 subjects in total.
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
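This field is left blank in the registration. As an illustration, the sketch below computes an MDE in standard-deviation units for a two-arm clustered comparison using the planned design (4 sessions per arm, roughly 20 subjects per session) and the usual design-effect adjustment; the intracluster correlation, power, and significance level are assumptions, not registered values.

```python
from math import sqrt
from scipy.stats import norm

def clustered_mde(n_clusters_per_arm, cluster_size, icc,
                  alpha=0.05, power=0.80):
    """MDE (in SD units) for a two-arm cluster-randomized comparison,
    inflating the variance by the design effect 1 + (m - 1) * ICC."""
    deff = 1 + (cluster_size - 1) * icc
    n_per_arm = n_clusters_per_arm * cluster_size
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    return z * sqrt(2 * deff / n_per_arm)

# Planned design: 4 sessions per arm, ~20 subjects per session;
# ICC = 0.05 is an illustrative assumption.
print(f"MDE ~ {clustered_mde(4, 20, 0.05):.2f} SD")
```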
IRB

Institutional Review Boards (IRBs)

IRB Name
Center of Behavior and Economic Research
IRB Approval Date
2022-10-17
IRB Approval Number
IRB202200163

Post-Trial

Post Trial Information

Study Withdrawal

Intervention

Is the intervention completed?
No
Data Collection Complete
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials