
Nudges in "Equilibrium"
Last registered on April 28, 2019

Pre-Trial

Trial Information
General Information
Title
Nudges in "Equilibrium"
RCT ID
AEARCTR-0002435
Initial registration date
December 21, 2017
Last updated
April 28, 2019 5:49 AM EDT
Location(s)
Region
Primary Investigator
Affiliation
Yale University
Other Primary Investigator(s)
Additional Trial Information
Status
In development
Start date
2018-01-01
End date
2019-08-31
Secondary IDs
Abstract
"Nudges" have increasingly shown to be cost-effective tools for promoting a wide range of behaviors, from medication adherence to saving to energy efficiency. But most research evaluates one intervention in isolation on target outcomes. As such, we have little understanding of how campaigns might interact with one another, or whether they generate spillovers in unanticipated domains. This paper explores the hypothesis that such campaigns might interfere with one another due to limited attention. I propose a simple framework, motivated by a taxonomy of attention from the psychology literature that distinguishes between “internal” and “external” attention. I test the predictions of the framework using an experiment in which individuals receive combinations of messages and incentives for two healthy behaviors.
External Link(s)
Registration Citation
Citation
Trachtman, Hannah. 2019. "Nudges in "Equilibrium"." AEA RCT Registry. April 28. https://doi.org/10.1257/rct.2435-6.0.
Former Citation
Trachtman, Hannah. 2019. "Nudges in "Equilibrium"." AEA RCT Registry. April 28. https://www.socialscienceregistry.org/trials/2435/history/45554.
Sponsors & Partners

There are documents in this trial that are unavailable to the public.
Experimental Details
Interventions
Intervention(s)
We will implement two interventions for two health behaviors in various combinations across treatment arms. The first is a messaging program: subjects will receive two daily messages about the behavior, one a simple reminder and the other containing information about the behavior's benefits. The second is an incentive program: participants will earn one lottery ticket for every day they successfully perform the behavior, and winning tickets will be drawn at the end of the treatment period. Both interventions will last four weeks.
Intervention Start Date
2019-01-02
Intervention End Date
2019-02-23
Primary Outcomes
Primary Outcomes (end points)
Whether or not participants engaged in each of the two behaviors, at the individual-day level
Primary Outcomes (explanation)
See pre-analysis plan.
Secondary Outcomes
Secondary Outcomes (end points)
Expectations, opt-outs, response to a surprise raffle via SMS, score on quiz about information sent via messages, health
Secondary Outcomes (explanation)
See pre-analysis plan.
Experimental Design
Experimental Design
Some participants will receive message or incentive programs for only one behavior, and some will receive messaging programs for both behaviors. The key outcome of interest will be whether or not participants engage in each behavior at the individual-day level. By examining spillovers and interactions between interventions, we will be able to distinguish between two types of limited attention: internal and external.
Experimental Design Details
See pre-analysis plan.
Randomization Method
Randomization is done in the office by a computer, with re-randomization (see the pre-analysis plan and the illustrative sketch below).
Randomization Unit
Individual
Was the treatment clustered?
No
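
The re-randomization procedure itself is specified in the pre-analysis plan. As a rough illustration only, the sketch below shows one common form of re-randomization: drawing assignments repeatedly until baseline covariates are balanced across arms. The covariates, balance criterion, tolerance, and number of arms here are hypothetical and are not taken from this trial.

```python
# Illustrative re-randomization sketch (hypothetical parameters; the actual
# procedure is specified in the pre-analysis plan).
import numpy as np

def rerandomize(X, n_arms, balance_tol=0.1, max_draws=10_000, seed=0):
    """Draw arm assignments until baseline covariates X (n x k) are balanced.

    Assumed balance criterion: the largest absolute standardized difference
    in covariate means between any two arms must fall below `balance_tol`.
    """
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    sd = X.std(axis=0, ddof=1)
    sd = np.where(sd == 0, 1.0, sd)  # guard against constant covariates

    for _ in range(max_draws):
        arms = rng.permutation(np.arange(n) % n_arms)  # near-equal arm sizes
        means = np.vstack([X[arms == a].mean(axis=0) for a in range(n_arms)])
        # largest standardized pairwise difference across arms and covariates
        max_diff = max(
            np.max(np.abs(means[a] - means[b]) / sd)
            for a in range(n_arms)
            for b in range(a + 1, n_arms)
        )
        if max_diff < balance_tol:
            return arms
    raise RuntimeError("No draw met the balance criterion; loosen balance_tol.")

# Hypothetical usage: 500 participants, 3 baseline covariates, 4 arms.
X = np.random.default_rng(1).normal(size=(500, 3))
assignment = rerandomize(X, n_arms=4)
```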
Experiment Characteristics
Sample size: planned number of clusters
n/a
Sample size: planned number of observations
3780
Sample size (or number of clusters) by treatment arms
Each treatment arm will have a slightly different size, computed via a power calculation that depends on several factors (minimum detectable effect, serial correlation of the outcome, and variance of the outcome).
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
The minimum detectable effect size for both spillovers and crowd-out is a 35% reduction in a behavior. This will translate into different units and standardized effects depending on the outcome.
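
For illustration, the sketch below shows how the factors listed above (minimum detectable effect, serial correlation, and outcome variance) can map into an arm size, using a standard two-arm normal-approximation formula with a compound-symmetry design effect for repeated individual-day outcomes. This is not the registered power calculation; the baseline rate, serial correlation, and 28-day window are assumptions.

```python
# Illustrative power calculation (hypothetical inputs; the registered
# calculation is in the pre-analysis plan).
from scipy.stats import norm

def n_per_arm(mde, sigma2, rho, n_days, alpha=0.05, power=0.80):
    """Participants per arm for a two-arm comparison of person-level means.

    Uses the normal-approximation sample size formula with a design effect
    of (1 + (n_days - 1) * rho) / n_days for n_days repeated observations
    with serial (compound-symmetry) correlation rho.
    """
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    var_person_mean = sigma2 * (1 + (n_days - 1) * rho) / n_days
    return 2 * z**2 * var_person_mean / mde**2

# Hypothetical inputs: daily behavior with baseline rate 0.5 (variance 0.25),
# serial correlation 0.3, 28 intervention days, and a 35% reduction (MDE = 0.175).
print(round(n_per_arm(mde=0.35 * 0.5, sigma2=0.25, rho=0.3, n_days=28)))
```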
IRB
INSTITUTIONAL REVIEW BOARDS (IRBs)
IRB Name
Yale Human Subjects Committee
IRB Approval Date
2018-12-17
IRB Approval Number
2000021379
Analysis Plan

There are documents in this trial that are unavailable to the public.
Post-Trial
Post Trial Information
Study Withdrawal
Intervention
Is the intervention completed?
No
Is data collection complete?
Data Publication
Data Publication
Is public data available?
No
Program Files
Program Files
Reports and Papers
Preliminary Reports
Relevant Papers