Defaults vs. Direct Advice

Last registered on October 30, 2018

Pre-Trial

Trial Information

General Information

Title
Defaults vs. Direct Advice
RCT ID
AEARCTR-0003490
Initial registration date
October 25, 2018

The initial registration date is when the registration was submitted to the Registry to be reviewed for publication.

First published
October 30, 2018, 5:58 PM EDT

First published corresponds to when the trial was first made public on the Registry after being reviewed.

Locations

Region

Primary Investigator

Affiliation
University of Bonn

Other Primary Investigator(s)

PI Affiliation
BRIQ
PI Affiliation
University of Copenhagen

Additional Trial Information

Status
In development
Start date
2018-10-26
End date
2018-12-31
Secondary IDs
Abstract
The behavioral relevance of non-binding defaults is well established. A central reason why defaults may affect choices is that they entail implicit recommendations (McKenzie et al., 2006; Altmann et al., 2018), i.e., they function as a mode of communication between default setters and decision makers. This naturally raises the question of how communication through defaults differs from other forms of communication. We conduct a laboratory experiment to analyze the extent to which default options differ from direct advice.
External Link(s)

Registration Citation

Citation
Altmann, Steffen, Armin Falk and Andreas Grunewald. 2018. "Defaults vs. Direct Advice." AEA RCT Registry. October 30. https://doi.org/10.1257/rct.3490-1.0
Former Citation
Altmann, Steffen, Armin Falk and Andreas Grunewald. 2018. "Defaults vs. Direct Advice." AEA RCT Registry. October 30. https://www.socialscienceregistry.org/trials/3490/history/36521
Experimental Details

Interventions

Intervention(s)
Intervention Start Date
2018-10-26
Intervention End Date
2018-12-31

Primary Outcomes

Primary Outcomes (end points)
Please refer to the pre-analysis plan.
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
We study a simple sender-receiver paradigm also used in Altmann et al. (2018). In one treatment condition, the first mover specifies a default; in the second treatment condition, she specifies a message for the decision maker.
Experimental Design Details
Randomization Method
Randomization done by a computer.
Randomization Unit
Session-level randomization for the main treatment; individual-level randomization for the second treatment dimension (player role). An illustrative sketch of this two-level assignment appears below.
Was the treatment clustered?
Yes
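
As a purely illustrative sketch of the two-level assignment described above (not the registered implementation; the session names, participant IDs, and seed are hypothetical), the randomization could look as follows in Python:

    import random

    # Illustrative sketch only -- not the registered randomization procedure.
    # Session names, participant IDs, and the seed below are hypothetical.
    random.seed(3490)

    sessions = [f"session_{i:02d}" for i in range(1, 17)]  # hypothetical list of lab sessions

    # Main treatment (Defaults vs. Direct Advice): randomized at the session level.
    shuffled = random.sample(sessions, len(sessions))
    condition = {
        s: ("default" if idx < len(sessions) // 2 else "advice")
        for idx, s in enumerate(shuffled)
    }

    # Second treatment dimension (player role): randomized at the individual level
    # within each session.
    def assign_roles(participants):
        """Split a session's participants into first movers and decision makers at random."""
        order = random.sample(participants, len(participants))
        half = len(order) // 2
        return {
            p: ("first_mover" if idx < half else "decision_maker")
            for idx, p in enumerate(order)
        }

    roles = assign_roles([f"p{i}" for i in range(1, 7)])  # e.g., one group of 6 participants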

Experiment Characteristics

Sample size: planned number of clusters
96 matching groups
Sample size: planned number of observations
576 individuals and 14,400 choices
Sample size (or number of clusters) by treatment arms
48 matching groups per treatment
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
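Assuming equal allocation, the planned numbers above are internally consistent: 48 matching groups per treatment × 2 treatments = 96 matching groups; 576 / 96 = 6 individuals per matching group; and 14,400 / 576 = 25 choices per individual.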
IRB

Institutional Review Boards (IRBs)

IRB Name
IRB Approval Date
IRB Approval Number
Analysis Plan

There is information in this trial unavailable to the public; access can be requested from the Registry.

Post-Trial

Post Trial Information

Study Withdrawal

There is information in this trial unavailable to the public; access can be requested from the Registry.

Intervention

Is the intervention completed?
No
Data Collection Complete
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials