Survey uptake decisions with a transparent default

Last registered on May 25, 2021

Pre-Trial

Trial Information

General Information

Title
Survey uptake decisions with a transparent default
RCT ID
AEARCTR-0007715
Initial registration date
May 24, 2021

First published
May 25, 2021, 4:23 PM EDT

Locations

Region

Primary Investigator

Affiliation
U.S. General Services Administration

Other Primary Investigator(s)

PI Affiliation
Harvard University
PI Affiliation
Florida State University

Additional Trial Information

Status
In development
Start date
2021-05-24
End date
2021-09-01
Secondary IDs
Abstract
Default selections are effective at encouraging desired choices among respondents, but may restrict autonomy if respondents are unaware of the default selection or its purpose. Transparency when deploying default selections may be a solution. Evidence suggests that transparent defaults may increase the effectiveness of default selections by increasing the perception that the endorser is fair and trustworthy or providing justification for the preferred choice. This study will test whether presenting a transparent default choice increases survey respondent sign-up for future surveys. The goal of this study is to learn whether disclosing the purpose of a default selection affects the choice to be contacted for future surveys.
External Link(s)

Registration Citation

Citation
Bell, Elizabeth, Michael Hand and Mattie Toma. 2021. "Survey uptake decisions with a transparent default." AEA RCT Registry. May 25. https://doi.org/10.1257/rct.7715-1.0
Experimental Details

Interventions

Intervention(s)
A survey question with a pre-selected response will include a transparency statement that describes why that response has been preselected.
Intervention (Hidden)
At the conclusion of a survey respondents will see the following question:
“Thank you for completing this survey!

Can we contact you about similar future surveys? You'd never get more than one invitation a month - usually much less often. It would always be your choice whether or not to answer each survey.

These surveys are part of work the Office of Evaluation Sciences does to improve government services by using behavioral insights, and we'd be very grateful for your help.”

Below this question respondents will have the option of selecting “Yes you can contact me again,” or “No please don’t contact me again.”

For all respondents the “Yes” option will be preselected (the default selection). Respondents in the treatment group will see the following transparency statement with the question text: "NOTE: We have preselected this option because we want to have enough respondents for future surveys to help build evidence to improve government services."
Intervention Start Date
2021-05-24
Intervention End Date
2021-09-01

Primary Outcomes

Primary Outcomes (end points)
Respondent selection of the "Yes" response (vs. selection of the "No" response).
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
Survey respondents will be randomly assigned to one of two arms: a standard default choice or a transparent default choice.
Experimental Design Details
Randomization Method
Simple randomization will be conducted within the Qualtrics survey platform.
Randomization Unit
Individual
Was the treatment clustered?
No
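
The registration specifies simple, unclustered randomization at the individual level (implemented inside Qualtrics, whose internal mechanism is not shown here). As a rough sketch of what that design means, each respondent is independently assigned to an arm with probability 1/2; the function name and seed below are illustrative, not from the registration:

```python
import random

def assign_arm(rng):
    """Simple individual-level randomization: each respondent is
    independently assigned to control or the transparent-default
    treatment with probability 1/2 (no clustering)."""
    return "treatment" if rng.random() < 0.5 else "control"

rng = random.Random(7)  # fixed seed so the illustration is reproducible
arms = [assign_arm(rng) for _ in range(500)]
print(arms.count("treatment"), arms.count("control"))
```

Note that simple randomization yields arm sizes only approximately equal to the planned 250/250 split; an exactly balanced split would require complete (shuffled-list) randomization instead.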

Experiment Characteristics

Sample size: planned number of clusters
500
Sample size: planned number of observations
500
Sample size (or number of clusters) by treatment arms
250 individuals in the control arm and 250 individuals in the treatment arm
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
With a sample size of N=500 respondents and a base rate of uptake of 0.5, the study can detect changes in uptake as small as 12.5 percentage points (with 80% power).
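
The registration does not show the formula behind this figure, but the stated 12.5 percentage point MDE is consistent with a standard normal-approximation two-sample proportion test (two-sided, alpha = 0.05, variance evaluated at the base rate). A minimal sketch, with an illustrative helper name not from the registration:

```python
from math import sqrt
from statistics import NormalDist

def mde_two_proportions(n_per_arm, p0, alpha=0.05, power=0.80):
    """Minimum detectable effect for a two-sample proportion test,
    using the normal approximation with variance evaluated at p0."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # critical value, two-sided test
    z_beta = NormalDist().inv_cdf(power)           # power quantile
    se = sqrt(2 * p0 * (1 - p0) / n_per_arm)       # SE of the difference in proportions
    return (z_alpha + z_beta) * se

# 500 respondents split evenly across arms, base uptake rate 0.5
print(round(mde_two_proportions(250, 0.5), 3))  # → 0.125 (12.5 percentage points)
```
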
IRB

Institutional Review Boards (IRBs)

IRB Name
Harvard University-Area Committee on the Use of Human Subjects
IRB Approval Date
2021-02-01
IRB Approval Number
IRB21-0002 (Note: this study was deemed Not Human Subjects Research)

Post-Trial

Post Trial Information

Study Withdrawal

Intervention

Is the intervention completed?
No
Data Collection Complete
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials