Thinking Through Complexity

Last registered on February 18, 2026

Pre-Trial

Trial Information

General Information

Title
Thinking Through Complexity
RCT ID
AEARCTR-0017632
Initial registration date
February 11, 2026

Initial registration date is when the trial was registered.

It corresponds to when the registration was submitted to the Registry to be reviewed for publication.

First published
February 18, 2026, 6:11 AM EST

First published corresponds to when the trial was first made public on the Registry after being reviewed.

Locations

There is information in this trial unavailable to the public.

Primary Investigator

Affiliation

Other Primary Investigator(s)

PI Affiliation
Stanford
PI Affiliation
Stanford

Additional Trial Information

Status
In development
Start date
2026-02-15
End date
2026-12-31
Secondary IDs
Prior work
This trial does not extend or rely on any prior RCTs.
Abstract
This study investigates how individuals reason in complex economic environments involving indirect and feedback effects. Participants in an online experiment take part in a “transfer game” with three fictitious players and a fixed set of rules that determine how money moves between players over two or three rounds. In each game, participants predict how many coins one player will end up with and earn a bonus based on prediction accuracy. A key feature of the design is that participants do not see the full set of transfer rules directly. Instead, they can consult a simple calculator that reveals the consequence of a specific transfer rule they choose to look up. By observing both participants’ predictions and the information they choose to look up, we study how people simplify complex environments by focusing on some effects and ignoring others, and how these simplifications depend on the complexity of the environment, information costs, and incentives for accuracy.
External Link(s)

Registration Citation

Citation
Bernheim, B. Douglas, Jon Hartley and Vlastimil Rasocha. 2026. "Thinking Through Complexity." AEA RCT Registry. February 18. https://doi.org/10.1257/rct.17632-1.0
Experimental Details

Interventions

Intervention(s)
Intervention Start Date
2026-02-15
Intervention End Date
2026-12-31

Primary Outcomes

Primary Outcomes (end points)
Transfers considered: Which transfers (i, j, t) are considered, based on calculator look-ups (sender i, receiver j, round t), including summary measures such as the total number of transfers considered and the number of relevant transfers considered.

Effects considered: Indicators for whether the participant considered each of the three effect types—direct, indirect, and (in the 3-round game) feedback—defined by whether the participant looked up all transfers along at least one path consistent with the effect.

Mental model measures from look-ups: Classification of observations/participants into mental models based on the pattern of calculator look-ups, including:
- Non-parametric model frequency/deviation measures (share of observations matching a given look-up pattern, or within 1–2 deviations).
- Parametric (maximum-likelihood) participant classification into the mental model that best explains look-ups across the three game versions (and the implied deviation/error rate).
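
The parametric classification described above could be sketched as follows, under the assumption that each candidate mental model predicts a set of look-ups and that deviations (extra or missing look-ups) occur independently at some error rate. The model names, the error-rate grid, and the example data below are hypothetical, not taken from the study:

```python
from itertools import product
import math

def log_likelihood(observed, predicted, universe, eps):
    """Log-likelihood of an observed look-up set under a model that
    predicts `predicted`: each look-up in the universe independently
    deviates (looked up though not predicted, or skipped though
    predicted) with probability eps."""
    ll = 0.0
    for lookup in universe:
        deviates = (lookup in observed) != (lookup in predicted)
        ll += math.log(eps if deviates else 1 - eps)
    return ll

def classify(observed, models, universe, eps_grid=(0.01, 0.05, 0.1, 0.2)):
    """Return the (model name, error rate) pair that best explains the look-ups."""
    best = max(
        ((name, eps, log_likelihood(observed, pred, universe, eps))
         for (name, pred), eps in product(models.items(), eps_grid)),
        key=lambda t: t[2],
    )
    return best[0], best[1]

# Hypothetical example: two candidate mental models for a two-round game.
players = ["Blue", "Red", "Yellow"]
rounds = [1, 2]
universe = {(i, j, t) for i in players for j in players for t in rounds if i != j}
models = {
    "direct-only": {("Yellow", "Blue", 1)},  # considers only the direct transfer
    "full": universe,                        # considers every transfer
}
observed = {("Yellow", "Blue", 1), ("Yellow", "Red", 1)}
print(classify(observed, models, universe))  # → ('direct-only', 0.1)
```

In this sketch the error rate is fit over a coarse grid jointly with the model; a continuous maximum-likelihood fit of the deviation rate would work the same way.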
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
This study is an online lab experiment in which participants complete a “transfer game” prediction task. Each game involves three fictitious players (Blue, Red, Yellow) with bank accounts. Yellow starts with 100 coins and the other players start with 0. The game lasts either two or three rounds, depending on treatment. In each round, each player follows a rule that specifies how many coins they transfer to each of the other players, how many they keep, and how many they spend; the rules change by round.

Participants’ task is to predict how many coins will be in Blue’s account at the end of the game. Participants do not see the underlying rules directly. Instead, they can use a custom calculator that reveals the transfers implied by a specific rule they choose to look up (by selecting the round, sender, receiver, and a starting balance). We record participants’ final predictions and the complete history of their calculator look-ups.
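
To make the mechanics concrete, here is a minimal sketch of how a game of this form could be simulated. The player names and starting balances come from the design above, but the rule format and the example rules are hypothetical (the actual rules are hidden from participants and not published here):

```python
def play_transfer_game(rules, start=None):
    """Simulate a transfer game with hypothetical rules.

    `rules` is a list of per-round dicts mapping (sender, receiver) to a
    fixed number of coins transferred that round; coins not transferred
    are kept.  Spending could be modeled as a transfer to a sink account.
    This is a simplified sketch, not the study's actual rule set."""
    balances = dict(start or {"Blue": 0, "Red": 0, "Yellow": 100})
    for round_rules in rules:
        snapshot = dict(balances)  # transfers use start-of-round balances
        for (sender, receiver), coins in round_rules.items():
            amount = min(coins, snapshot[sender])  # cannot send more than held
            balances[sender] -= amount
            balances[receiver] += amount
    return balances

# Example: a two-round game with made-up rules.
rules = [
    {("Yellow", "Blue"): 50, ("Yellow", "Red"): 30},  # round 1
    {("Red", "Blue"): 15},                            # round 2
]
print(play_transfer_game(rules)["Blue"])  # → 65
```

The round-2 transfer from Red to Blue illustrates an indirect effect: coins reach Blue only via Red, so a participant who looks up only the direct Yellow-to-Blue rule would mispredict Blue's final balance.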

The experiment uses a between-subject design with randomized variation in: (i) the number of rounds in the game (two vs. three), (ii) whether calculator look-ups are free or carry a small per-look-up cost, and (iii) the size of the accuracy-based bonus available for the prediction. Participants complete multiple versions of the game in random order, and one is randomly selected for payment.
Experimental Design Details
Not available
Randomization Method
Participants are randomly assigned to the following treatment conditions in a between-subject design: (i) number of rounds in the transfer game (2 vs. 3), (ii) whether calculator look-ups are free vs. carry a small per-look-up cost, and (iii) the incentive level (maximum bonus). Randomization is carried out by computer within the Qualtrics survey.

Participants then complete three game versions in randomized order, and one version is randomly selected by the software for payment.
Randomization Unit
Individual-level randomization. Random assignment is conducted at the participant level (each participant is assigned to a single treatment cell in a between-subject design). There is no cluster- or group-level randomization.
Was the treatment clustered?
No

Experiment Characteristics

Sample size: planned number of clusters
NA
Sample size: planned number of observations
Approximately 2,250 participants. Each participant completes three versions of the transfer game, yielding approximately 6,750 participant × game-version observations.
Sample size (or number of clusters) by treatment arms
- Two-round game + Look-ups are free + Maximum bonus is $10: ~500
- Two-round game + Each look-up costs $0.01 + Maximum bonus is $10: ~500
- Three-round game + Look-ups are free + Maximum bonus is $10: ~500
- Three-round game + Each look-up costs $0.01 + Maximum bonus is $10: ~500
- Two-round game + Look-ups are free + Maximum bonus is $50: ~250
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
IRB

Institutional Review Boards (IRBs)

IRB Name
Stanford IRB
IRB Approval Date
2025-08-21
IRB Approval Number
IRB-42264
Analysis Plan

There is information in this trial unavailable to the public.