The demand effect in WTP elicitation
Last registered on February 15, 2021

Pre-Trial

Trial Information
General Information
Title
The demand effect in WTP elicitation
RCT ID
AEARCTR-0007210
Initial registration date
February 15, 2021
Last updated
February 15, 2021 11:41 AM EST
Location(s)
Region
Primary Investigator
Affiliation
Other Primary Investigator(s)
Additional Trial Information
Status
In development
Start date
2021-02-18
End date
2021-02-28
Secondary IDs
Abstract
The experimenter demand effect, i.e., changes in behavior by participants who try to infer and conform to the experimenter's objective (Zizzo 2010), is a concern in all non-natural experiments with human participants. For example, participants who believe the researcher wants to promote energy-efficient appliance adoption might report a higher willingness-to-pay (WTP) than they otherwise would. In this study, we apply the method proposed by De Quidt et al. (2018) to measure upper and lower bounds on the experimenter demand effect of an information treatment intervention. By exposing participants to explicit demand treatments, which are likely to be more informative than the implicit signals about demand present in typical information intervention studies, we measure the strongest possible treatment effect under the influence of experimenter demand.
External Link(s)
Registration Citation
Citation
Gao, Yu. 2021. "The demand effect in WTP elicitation." AEA RCT Registry. February 15. https://doi.org/10.1257/rct.7210-1.0.
Experimental Details
Interventions
Intervention(s)
See uploaded file.
Intervention Start Date
2021-02-18
Intervention End Date
2021-02-28
Primary Outcomes
Primary Outcomes (end points)
Willingness-to-pay
Primary Outcomes (explanation)
Secondary Outcomes
Secondary Outcomes (end points)
Secondary Outcomes (explanation)
Experimental Design
Experimental Design
We will follow the method proposed by De Quidt et al. (2018) to measure upper and lower bounds on the experimenter demand effect of an information treatment intervention.
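As a rough illustration of the bounding logic only (the registered design is in the uploaded file), a minimal R sketch follows; the data, variable names, and arm labels are all hypothetical.

# Illustrative only: simulated placeholder data. In the actual study, `wtp` would be
# the elicited willingness-to-pay and `arm` the randomly assigned demand-treatment arm.
set.seed(1)
dat <- data.frame(
  arm = rep(c("neutral", "positive_demand", "negative_demand"), each = 220),
  wtp = rnorm(3 * 220, mean = 50, sd = 21)
)

# Mean WTP by arm: following De Quidt et al. (2018), mean WTP under the positive
# (negative) demand treatment bounds demand-free WTP from above (below).
bounds <- aggregate(wtp ~ arm, data = dat, FUN = mean)
upper <- bounds$wtp[bounds$arm == "positive_demand"]
lower <- bounds$wtp[bounds$arm == "negative_demand"]
c(lower = lower, upper = upper)

# Two-sided test of whether the two bounds differ, i.e., how sensitive elicited
# WTP is to explicit demand signals.
t.test(wtp ~ arm, data = subset(dat, arm %in% c("positive_demand", "negative_demand")))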
Experimental Design Details
Randomization Method
Randomization will be done by the survey platform.
Randomization Unit
individual
Was the treatment clustered?
No
Experiment Characteristics
Sample size: planned number of clusters
220 per group (randomization is at the individual level, so each participant constitutes a cluster).
Sample size: planned number of observations
1,980 (220 per group × 9 groups)
Sample size (or number of clusters) by treatment arms
220 per treatment arm, 9 arms (1,980 in total)
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
# Power calculation in R: solves for the per-group sample size needed to detect an
# effect of 4 (SD = 21) with 80% power in a one-sample, two-sided t-test at the 5%
# level; this yields roughly 219 participants, consistent with the planned 220 per group.
power.t.test(n = NULL, delta = 4, sd = 21, power = 0.8, type = "one.sample", alternative = "two.sided")
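The same call can be inverted to report the minimum detectable effect implied by the planned sample; a minimal sketch assuming 220 participants per group and the same SD of 21:

# Solve for the minimum detectable effect (delta) given n = 220 per group,
# SD = 21, 80% power, and a two-sided 5% test; this returns a delta of roughly 4.
power.t.test(n = 220, sd = 21, power = 0.8, type = "one.sample", alternative = "two.sided")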
IRB
INSTITUTIONAL REVIEW BOARDS (IRBs)
IRB Name
GSM
IRB Approval Date
2021-01-08
IRB Approval Number
2021-04
Analysis Plan

Some documents in this trial are unavailable to the public and must be requested from the registry.

Post-Trial
Post Trial Information
Study Withdrawal
Intervention
Is the intervention completed?
No
Is data collection complete?
Data Publication
Data Publication
Is public data available?
No
Program Files
Program Files
Reports, Papers & Other Materials
Relevant Paper(s)
REPORTS & OTHER MATERIALS