Strategic Interactions with the Assistant of an Algorithm: Information Acquisition Treatment

Last registered on October 25, 2022


Trial Information

General Information

Strategic Interactions with the Assistant of an Algorithm: Information Acquisition Treatment
Initial registration date
October 18, 2022


First published
October 25, 2022, 10:47 AM EDT




Primary Investigator

Wuhan University

Other Primary Investigator(s)

PI Affiliation
Wuhan University
PI Affiliation
Wuhan University
PI Affiliation
Wuhan University

Additional Trial Information

Ongoing
Start date
End date
Secondary IDs
Prior work
This trial is based on or builds upon one or more prior RCTs.
We present a laboratory experiment that examines individuals' willingness to take advice from algorithms provided by artificial intelligence (AI). We explore two distinct channels behind the algorithm: the data and the mechanism. In this additional experiment, we will examine individuals' willingness to pay for the algorithm's advice using a within-subject information acquisition treatment design. We will also compare welfare effects, measured by subject payoffs, between information provision and information acquisition. The no-information benchmark for this comparison comes from a previously conducted RCT, which we also pre-registered on this website.
External Link(s)

Registration Citation

Bai, Lu et al. 2022. "Strategic Interactions with the Assistant of an Algorithm: Information Acquisition Treatment." AEA RCT Registry. October 25.
Experimental Details


This is a laboratory experiment; student subjects will be randomly invited to one of the randomized experimental treatments.
Subjects will be instructed to play a repeated strategic interaction game and will have the opportunity to buy advice from an algorithm.
Subjects' payoffs will be determined by the decisions they and their counterparts make in the game and by the amount they pay for the algorithm's advice.
In the experiment, the maximum amount a participant can pay will be lower than the minimum amount they can earn from the interaction game.
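The payoff rule above can be sketched as follows (a minimal illustration; the function and parameter names are our own assumptions, not the registered implementation):

```python
def round_payoff(game_earnings: float, advice_price: float, bought_advice: bool) -> float:
    """Subject's payoff for one round: earnings from the strategic
    interaction game, minus the price paid for the algorithm's advice
    if it was purchased in that round."""
    cost = advice_price if bought_advice else 0.0
    return game_earnings - cost
```

Because the maximum advice price is set below the minimum possible game earnings, this payoff is always positive.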
Intervention Start Date
Intervention End Date

Primary Outcomes

Primary Outcomes (end points)
Willingness to pay (WTP) for the algorithm
Primary Outcomes (explanation)
WTP will be elicited using the BSR method in the information acquisition design.
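The registration specifies the BSR method for eliciting WTP; as a generic illustration of incentive-compatible WTP elicitation for a purchasable signal, a BDM-style random-price mechanism works as sketched below. This is our own illustrative substitute, not the registered procedure, and all names are assumptions:

```python
import random

def bdm_purchase(wtp: float, max_price: float, rng: random.Random):
    """BDM-style mechanism: the subject states a WTP for the advice,
    then a price is drawn uniformly from [0, max_price]. If the drawn
    price is at most the stated WTP, the subject buys the advice and
    pays the drawn price (not the stated WTP); otherwise no purchase.

    Truthful reporting is optimal because the stated WTP only
    determines *whether* a trade happens, never the price paid."""
    price = rng.uniform(0.0, max_price)
    bought = price <= wtp
    cost = price if bought else 0.0
    return bought, cost
```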

Secondary Outcomes

Secondary Outcomes (end points)
Subject payoff (welfare analysis)
Secondary Outcomes (explanation)
We will compare the welfare effects of information provision and information acquisition by comparing subject payoffs in three settings: without AI advice (from a pilot experiment we ran earlier without algorithm advice), in free-information rounds, and in costly-information rounds.

Experimental Design

Experimental Design
Our experimental design combines between- and within-subject components.
In the control group, subjects will play the one-shot centipede game repeatedly for 10 rounds: 5 rounds with free algorithm advice and 5 rounds with costly algorithm advice. The order of these two blocks of 5 rounds will be counterbalanced across sessions.
In the treatment group, subjects will play the same centipede game for 10 rounds with the same costly-to-free/free-to-costly counterbalancing. The only difference is that subjects in the treatment group will be informed of the underlying data (including five main determinants) that the algorithm used to generate its predictions.
Experimental Design Details
Randomization Method
Randomization is conducted by a computer.
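As an illustration, the computerized, session-level counterbalancing of free-first versus costly-first round orders could be sketched as follows (a minimal sketch under our own assumptions about variable names and seeding; not the registered randomization code):

```python
import random

ROUNDS = 10  # 5 free-advice rounds + 5 costly-advice rounds per session

def round_schedule(order: str) -> list:
    """Build one session's 10-round schedule of advice conditions."""
    free = ["free"] * (ROUNDS // 2)
    costly = ["costly"] * (ROUNDS // 2)
    return free + costly if order == "free_first" else costly + free

def assign_sessions(n_sessions: int, seed: int = 0) -> dict:
    """Counterbalance the block order across sessions: half of the
    sessions start with free advice, half with costly advice, with
    the assignment shuffled by a seeded (reproducible) RNG."""
    rng = random.Random(seed)
    orders = (["free_first"] * (n_sessions // 2)
              + ["costly_first"] * (n_sessions - n_sessions // 2))
    rng.shuffle(orders)
    return {s: round_schedule(o) for s, o in enumerate(orders, start=1)}

# 4 sessions per treatment, as planned in this registration
schedules = assign_sessions(4)
```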
Randomization Unit
Experimental sessions
Was the treatment clustered?

Experiment Characteristics

Sample size: planned number of clusters
Each treatment will have 4 sessions
Sample size: planned number of observations
Each session will have 15-25 student subjects
Sample size (or number of clusters) by treatment arms
We are planning to have 60-80 subjects per treatment. In total there will be 2 treatments, which will give us 120-160 subjects in total.
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)

Institutional Review Boards (IRBs)

IRB Name
Center of Behavior and Economic Research
IRB Approval Date
IRB Approval Number


Post Trial Information

Study Withdrawal



Is the intervention completed?
Data Collection Complete
Data Publication

Data Publication

Is public data available?

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials