Why Default Nudge Works

Last registered on April 26, 2024

Pre-Trial

Trial Information

General Information

Title
Why Default Nudge Works
RCT ID
AEARCTR-0012458
Initial registration date
April 23, 2024

Initial registration date is when the trial was registered; it corresponds to when the registration was submitted to the Registry to be reviewed for publication.

First published
April 26, 2024, 12:28 PM EDT

First published corresponds to when the trial was first made public on the Registry after being reviewed.

Locations

Region

Primary Investigator

Affiliation
University of Tokyo

Other Primary Investigator(s)

PI Affiliation
Hong Kong University of Science and Technology
PI Affiliation
Yale University
PI Affiliation
Kindai University
PI Affiliation
University of Zurich
PI Affiliation
Araya Inc.

Additional Trial Information

Status
In development
Start date
2024-02-05
End date
2024-07-31
Secondary IDs
Prior work
This trial does not extend or rely on any prior RCTs.
Abstract
We study the mechanism behind why the default nudge works. To investigate this question, we consider three factors suggested in the literature: ease, endowment, and endorsement. To examine how these factors affect the default nudge, we conduct an online experiment and an fMRI experiment. In these experiments, subjects answer a series of binary choice questions under five treatments: i) no default nudge; ii) a simple default nudge, where one alternative is preselected as the default; iii) a default nudge with ease, where the questions are phrased with many words so that the subject finds it easy to choose the default option; iv) a default nudge with endowment, where the default choice is associated with an endowment effect; and v) a default nudge with endorsement, where the default choice is endorsed in the question. After establishing how these factors affect the default nudge, we use the fMRI experiment to explore the neural basis of the emotional aspect of the mechanism through which these factors influence the effectiveness of the default nudge.







External Link(s)

Registration Citation

Citation
Chikazoe, Junichi et al. 2024. "Why Default Nudge Works." AEA RCT Registry. April 26. https://doi.org/10.1257/rct.12458-1.0
Sponsors & Partners

Sponsors

Experimental Details

Interventions

Intervention(s)
We ask our respondents repeated binary choice questions. Our interventions concern the "default" options for these binary choice questions.

Specifically, we have one control condition, in which subjects make binary choices without a default nudge, and the following four treatment conditions, in which subjects make binary decisions under a default nudge:
i) a simple default nudge, where one alternative is preselected as the default;
ii) a default nudge with ease, where the questions are phrased with many words so that the subject finds it easy to choose the default option;
iii) a default nudge with endowment, where the default choice is associated with an endowment effect; and
iv) a default nudge with endorsement, where the default choice is endorsed in the question.

With this design, we investigate how much default nudges increase the propensity to select "favorable" options and what the mechanism is when the default nudge works.
Intervention (Hidden)
Intervention Start Date
2024-02-05
Intervention End Date
2024-02-19

Primary Outcomes

Primary Outcomes (end points)
Our primary outcome variable for the survey experiment is the choice made in each binary choice question.
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
Subjects in our online survey experiment (the main study) respond to 10 binary choice questions. The contexts of the questions are drawn from extant default nudge studies, such as whether to participate in activities related to environmental conservation, join a pension plan, or donate to charitable causes. There are five types of question modalities, and respondents answer two questions per modality, for 10 questions in total.

For each of the five modalities, the subjects are randomly assigned to one of the following:
1) control, where there is no default nudge;
2) a simple default nudge, in which one alternative is preselected as the default;
3) a default nudge with ease, where the questions are phrased with many words so that the subject finds it easy to choose the default option;
4) a default nudge with endowment, where the default choice is associated with an endowment effect; and
5) a default nudge with endorsement, where the default choice is endorsed in the question.

Conditions (3)-(5) are designed to investigate the potential mechanisms behind why the default nudge may work: the effects of ease, endowment, and endorsement, respectively.
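As a concrete illustration, the per-modality, individual-level assignment described above could be simulated as follows (a minimal sketch with assumed names; this is not the authors' actual randomization code):

```python
# Sketch of individual-level random assignment: for each of the five
# question modalities, every subject is independently assigned to one of
# the five conditions (1 = control, 2-5 = the default-nudge variants).
# All names here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(42)
n_subjects, n_modalities, n_conditions = 1500, 5, 5

# assignment[i, m] = condition of subject i for modality m;
# each subject then answers two questions per modality under that
# condition, giving 10 questions per subject in total.
assignment = rng.integers(1, n_conditions + 1,
                          size=(n_subjects, n_modalities))
print(assignment.shape)
```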

In our analysis, after summarizing the descriptive statistics of the experiment, we regress the choice in each question on dummy variables for treatments (2) to (5), with and without different sets of controls.
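A minimal sketch of such a regression on simulated data, with hypothetical variable names (a linear probability model with treatment dummies and standard errors clustered at the individual level, consistent with the individual-level randomization; the authors may use a different specification):

```python
# Illustrative regression of choices on treatment dummies (2)-(5),
# with the control condition as the omitted baseline.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_subj, n_q = 200, 10  # small simulated stand-in for the planned sample
df = pd.DataFrame({
    "subject": np.repeat(np.arange(n_subj), n_q),
    # 1 = control, 2-5 = the four default-nudge treatments
    "treatment": rng.integers(1, 6, size=n_subj * n_q),
})
# Simulated binary choice: higher take-up under any default nudge
df["choice"] = rng.binomial(1, np.where(df["treatment"] > 1, 0.6, 0.4))

# OLS with treatment dummies, clustering standard errors by subject
model = smf.ols("choice ~ C(treatment)", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["subject"]}
)
print(model.summary().tables[1])
```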

In the fMRI experiment, each subject answers 50 questions: treatments (1) to (5), with two questions for each of the five modalities. The analysis for this experiment will use the fMRI images.
Experimental Design Details
Randomization Method
Randomization is done in an office by a computer.
Randomization Unit
The unit of randomization is at the individual level.
Was the treatment clustered?
No

Experiment Characteristics

Sample size: planned number of clusters
1,500 individuals for the online experiment and 35 individuals for the fMRI experiment
Sample size: planned number of observations
1,500 individuals for the online experiment; for the fMRI experiment, 35 individuals answering 50 questions each
Sample size (or number of clusters) by treatment arms
For the online survey experiment, we have five different modalities of questions, which we regard as different experiments.
As each subject answers 10 questions in total, each subject answers two questions per modality. We manipulate how the defaults are set for these binary questions. Since there are five treatment conditions (including the control condition), each treatment arm for the online survey has (1500 x 10)/(5 x 5) = 600 observations.
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
IRB

Institutional Review Boards (IRBs)

IRB Name
National Institute for Physiological Sciences
IRB Approval Date
2024-01-17
IRB Approval Number
EC01-069
Analysis Plan

There is information in this trial unavailable to the public.

Post-Trial

Post Trial Information

Study Withdrawal

There is information in this trial unavailable to the public.

Intervention

Is the intervention completed?
No
Data Collection Complete
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials