Cognitive Spillovers

Last registered on January 24, 2020

Pre-Trial

Trial Information

General Information

Title
Cognitive Spillovers
RCT ID
AEARCTR-0005044
Initial registration date
January 24, 2020

First published
January 24, 2020, 3:59 PM EST

Locations

Region

Primary Investigator

Affiliation

Other Primary Investigator(s)

PI Affiliation
PI Affiliation

Additional Trial Information

Status
In development
Start date
2020-01-27
End date
2021-06-30
Secondary IDs
Abstract
We conduct a laboratory experiment to investigate how individuals allocate scarce cognitive resources across two tasks. We study how monetary incentives for solving the different tasks influence cognitive resource allocation, task performance, and individuals’ propensity to stay passive and follow a randomly specified default in one of the tasks. Furthermore, we examine how treatments that foster active decision-making in one task affect the quality of individuals’ choices in the treated as well as the non-treated task.
External Link(s)

Registration Citation

Citation
Altmann, Steffen, Andreas Grunewald and Jonas Radbruch. 2020. "Cognitive Spillovers." AEA RCT Registry. January 24. https://doi.org/10.1257/rct.5044-1.0
Experimental Details

Interventions

Intervention(s)
Intervention Start Date
2020-01-27
Intervention End Date
2020-12-31

Primary Outcomes

Primary Outcomes (end points)
- Rate of passive choices (based on an indicator variable = 1 if a subject follows the default in the decision task in a given period).
- Attention span devoted to the decision task, measured as the total number of seconds spent on the task in a given period. Measured for individuals in the “Baseline” environment only.
- Fraction of correctly solved background tasks per individual.
- Fraction of correctly solved decision tasks per individual.
- Total payoff (sum of earnings from the decision task and the background task; individual-level average across the 20 rounds).
- Fraction of the maximally feasible payoff (“total payoff” / maximum payoff).
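For illustration only, a minimal sketch of how these individual-level outcomes could be constructed from round-level data; the column names (e.g., followed_default, bg_correct, dec_correct, payoff) are hypothetical and not part of the registration:

```python
import pandas as pd

# Hypothetical round-level data: one row per subject x round (the main experiment has 20 rounds).
rounds = pd.DataFrame({
    "subject":          [1, 1, 2, 2],
    "followed_default": [1, 0, 0, 0],              # = 1 if the default was implemented in the decision task
    "bg_correct":       [1, 1, 0, 1],              # background task solved correctly
    "dec_correct":      [0, 1, 1, 1],              # decision task solved correctly
    "payoff":           [0.40, 0.80, 0.20, 0.60],  # round earnings in EUR
    "max_payoff":       [0.80, 0.80, 0.80, 0.80],  # maximally feasible earnings per round
})

# Individual-level averages across rounds.
outcomes = rounds.groupby("subject").agg(
    passive_rate=("followed_default", "mean"),
    frac_bg_correct=("bg_correct", "mean"),
    frac_dec_correct=("dec_correct", "mean"),
    total_payoff=("payoff", "mean"),
)
# Fraction of the maximally feasible payoff = total payoff / maximum payoff.
outcomes["frac_max_payoff"] = (
    rounds.groupby("subject")["payoff"].sum() / rounds.groupby("subject")["max_payoff"].sum()
)
print(outcomes)
```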
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
The experiment involves 9 treatments in a 3x3 between-subjects design. It extends the experiments reported in Altmann et al. (2019). For further details about the tasks, treatments, and outcomes, please refer to the experimental design section of Altmann et al. (2019).
In each treatment, subjects are confronted with two tasks, denoted as “Background Task” and “Decision Task”, which they solve simultaneously.

In the background task, subjects have to memorize 7-digit numbers. At the beginning of each round, a new number is displayed for 10 seconds on subjects’ screens. The number then disappears and subjects have to keep it in mind. After another 30 seconds, subjects have to type the memorized number into an input field on their screen. During these 30 seconds between memorizing and entering the number, subjects can work on the decision task. In the decision task, subjects face three summations, each consisting of six addends. Their task is to decide which of the three options yields the highest sum.
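Purely as an illustration of the task structure (the function, the addend magnitudes, and all names below are hypothetical and not taken from the registration), one round could be generated as follows:

```python
import random

def generate_round(rng: random.Random):
    """Hypothetical round: a 7-digit number to memorize, three summation
    options of six addends each, and a randomly selected default option."""
    number = rng.randint(1_000_000, 9_999_999)           # background task: 7-digit number
    options = [[rng.randint(1, 99) for _ in range(6)]    # decision task: three summations,
               for _ in range(3)]                        # each consisting of six addends
    best = max(range(3), key=lambda i: sum(options[i]))  # option with the highest sum
    default = rng.randrange(3)                           # random default for the decision task
    return number, options, best, default

number, options, best, default = generate_round(random.Random(0))
```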

The first treatment dimension varies the monetary incentives for solving the tasks correctly:
1) In treatments with a “40/10” incentive, subjects earn € 0.40 [€ 0.10] if they correctly solve the background task [decision task] in a given round.
2) In treatments with a “40/20” incentive, subjects earn € 0.40 [€ 0.20] if they correctly solve the background task [decision task] in a given round.
3) In treatments with a “40/40” incentive, subjects earn € 0.40 [€ 0.40] if they correctly solve the background task [decision task] in a given round.
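A minimal sketch of the per-round payoff rule implied by these incentive conditions (the function and variable names are hypothetical):

```python
# Decision-task piece rates (EUR) by incentive condition; the background task always pays EUR 0.40.
DECISION_RATE = {"40/10": 0.10, "40/20": 0.20, "40/40": 0.40}
BACKGROUND_RATE = 0.40

def round_payoff(incentive: str, bg_correct: bool, dec_correct: bool) -> float:
    """Earnings for a single round under the given incentive condition."""
    return BACKGROUND_RATE * bg_correct + DECISION_RATE[incentive] * dec_correct

assert abs(round_payoff("40/20", bg_correct=True, dec_correct=True) - 0.60) < 1e-9
```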

The second treatment dimension varies how the decision task is displayed to subjects:
1) In treatments involving the “Baseline” environment, subjects can access the decision task by holding down a button on the keyboard. In addition, the decision task features a default option that is implemented if subjects do not make an active decision. In particular, in each round, one option of the decision task is randomly selected and displayed as the default choice.
2) In treatments involving the “Directed Attention” environment, the decision task is permanently displayed on subjects’ screens during the 30 seconds in which they keep the number for the background task in mind. Similar to the “Baseline” environment, the decision task in the Directed Attention environment involves a randomly selected default option.
3) In treatments involving the “Active Choice” environment, the decision task is permanently displayed on subjects’ screens during the 30 seconds in which they keep the number for the background task in mind. Moreover, the decision task in the Active Choice environment involves no default.

Subjects participate in only 1 treatment cell (defined by an “incentive” / “decision environment” combination). In the main part of the experiment, subjects play 20 rounds of the respective treatment. The tasks, numbers, defaults, and their order are identical across all subjects and treatments.
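For concreteness, the nine treatment cells of the 3x3 design can be enumerated as follows (the cell labels follow the registration; the code itself is only an illustrative sketch):

```python
from itertools import product

incentives = ["40/10", "40/20", "40/40"]
environments = ["Baseline", "Directed Attention", "Active Choice"]

# Each subject is assigned to exactly one of the nine incentive x environment cells
# and plays all 20 rounds of the main experiment within that cell.
treatment_cells = [f"{inc} {env}" for inc, env in product(incentives, environments)]
print(treatment_cells)
```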

Before participating in the main experiment, subjects complete two additional parts of the experiment in which they only face the background task (Part 1) and the decision task (Part 2), respectively. In Part 1, subjects play 10 rounds in which they have to memorize numbers of varying difficulty (between 7 and 8 digits). They receive € 0.10 per correctly solved task. In Part 2, subjects play 10 rounds of the decision task. They receive € 0.10 per correctly solved task.

References:
Altmann, S., Grunewald, A., & Radbruch, J. (2019). Passive Choices and Cognitive Spillovers. IZA Discussion Paper No. 12337

Experimental Design Details
Randomization Method
randomization done in office by a computer
Randomization Unit
individual
Was the treatment clustered?
No

Experiment Characteristics

Sample size: planned number of clusters
2250 individuals
Sample size: planned number of observations
2250 individuals, 20 rounds per individual
Sample size (or number of clusters) by treatment arms
We aim for n=300 participants in each of the following treatment cells:
40/10 Baseline, 40/10 Directed Attention, 40/10 Active Choice
40/20 Baseline, 40/20 Directed Attention, 40/20 Active Choice

We aim for n=150 participants in each of the following treatment cells:
40/40 Baseline, 40/40 Directed Attention, 40/40 Active Choice
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
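The registration leaves this field blank. Purely for illustration, and under assumptions that are not part of the registration (two-sided test, alpha = 0.05, power = 0.80, one observation per individual), the planned cell sizes would imply minimum detectable standardized effects computed as follows:

```python
from statsmodels.stats.power import TTestIndPower

# Illustrative only: minimum detectable effect (Cohen's d) for pairwise comparisons
# of treatment cells, assuming a two-sided test, alpha = 0.05, and power = 0.80.
analysis = TTestIndPower()
mde_300_vs_300 = analysis.solve_power(nobs1=300, alpha=0.05, power=0.80, ratio=1.0)
mde_150_vs_300 = analysis.solve_power(nobs1=150, alpha=0.05, power=0.80, ratio=2.0)
print(round(mde_300_vs_300, 3), round(mde_150_vs_300, 3))
```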
IRB

Institutional Review Boards (IRBs)

IRB Name
IRB Approval Date
IRB Approval Number

Post-Trial

Post Trial Information

Study Withdrawal

Intervention

Is the intervention completed?
No
Data Collection Complete
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials