Time involvement and consumers' willingness to pay for information
Last registered on February 16, 2018

Pre-Trial

Trial Information
General Information
Title
Time involvement and consumers' willingness to pay for information
RCT ID
AEARCTR-0002697
Initial registration date
February 14, 2018
Last updated
February 16, 2018 1:58 PM EST
Location(s)
Region
Primary Investigator
Affiliation
Cornell University
Other Primary Investigator(s)
PI Affiliation
Cornell University
Additional Trial Information
Status
In development
Start date
2018-02-25
End date
2018-02-26
Secondary IDs
Abstract
This experiment examines how people value information. It tests whether people will pay for useless information and which internal and external factors push them to do so. In this context, useless information is information that cannot beneficially inform future decisions. The study identifies the impact of effort invested in the decision-making process on consumers' willingness to pay for the information.
External Link(s)
Registration Citation
Citation
Gabrielyan, Gnel and David Just. 2018. "Time involvement and consumers' willingness to pay for information." AEA RCT Registry. February 16. https://doi.org/10.1257/rct.2697-1.0.
Former Citation
Gabrielyan, Gnel and David Just. 2018. "Time involvement and consumers' willingness to pay for information." AEA RCT Registry. February 16. https://www.socialscienceregistry.org/trials/2697/history/25893.
Experimental Details
Interventions
Intervention(s)
Intervention Start Date
2018-02-25
Intervention End Date
2018-02-26
Primary Outcomes
Primary Outcomes (end points)
The willingness to pay to know ex post whether they guessed correctly.

Primary Outcomes (explanation)
Secondary Outcomes
Secondary Outcomes (end points)
Indicators of effort
Secondary Outcomes (explanation)
Indicators of effort include both the availability of detailed information about the horses and self-reported use of the variable information in making their guess.
Experimental Design
Experimental Design
After choosing the winner in each session, participants will reveal their willingness to pay. After all the sessions are done, we will randomly select one of the sessions and play a Becker-DeGroot-Marschak auction. In this auction, each participant submits a whole-number bid representing the price they are willing to pay in order to know the result of that round. The bid is compared to a price determined by a random number generator. If the bid is greater than or equal to that price, the participant pays the randomly generated price and will learn the result at the end of the session.
Experimental Design Details
After 20 sessions of guessing the winners of horse races, we will inform participants of how much they have won based on one randomly selected race. After each selection is made, we will use a Becker-DeGroot-Marschak auction to elicit willingness to pay to know the outcome of that individual race, with the information being provided only after all selections have been made. In this auction, each participant submits a whole-number bid representing the price they are willing to pay in order to know the result of that round. The bid is compared to a price determined by a random number generator. If the bid is greater than or equal to that price, the participant pays the randomly generated price and will learn the result at the end of the session. Participants will be placed in two treatments. The first will provide only the number and name of each horse. The second will additionally provide the horse's name and age, the jockey's name and weight, the trainer's and owner's names, and the win odds.
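For concreteness, the sketch below illustrates the BDM payment rule described above. It is a minimal illustration only; the function name, price range, and data structure are assumptions, not taken from the study's actual implementation.

```python
# Minimal sketch of the Becker-DeGroot-Marschak (BDM) payment rule described above.
# The price range (0-10) and function name are illustrative assumptions.
import random

def bdm_outcome(bid, max_price=10):
    """Resolve one BDM round for a whole-number bid."""
    price = random.randint(0, max_price)  # price drawn by a random number generator
    if bid >= price:
        # Participant pays the drawn price (not the bid) and learns the
        # result of the selected round at the end of the session.
        return {"buys_information": True, "pays": price}
    # Bid below the drawn price: no payment, result is not revealed.
    return {"buys_information": False, "pays": 0}

if __name__ == "__main__":
    print(bdm_outcome(bid=3))
```

Because the participant pays the randomly drawn price rather than the bid, truthfully bidding one's willingness to pay is the optimal strategy, which is why the BDM mechanism is used for elicitation here.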
Randomization Method
Treatment randomization is done at each session by Qualtrics (survey software).
Randomization Unit
Randomization will be done for each individual in each session.
Was the treatment clustered?
No
Experiment Characteristics
Sample size: planned number of clusters
150 participants with 20 sessions each
Sample size: planned number of observations
3000
Sample size (or number of clusters) by treatment arms
1,500 per treatment arm (assuming uniform assignment)
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
This study has 80% power to detect an effect size of approximately E = 0.027, given alpha = 0.05, beta = 0.2, a standard deviation of 0.26 (based on pilot data), N1 = 1500, and N2 = 1500.
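As a rough check on the figure above, the snippet below reproduces it using the standard two-sample minimum-detectable-effect formula; the formula choice is an assumption, since the registration lists only the inputs.

```python
# Rough check of the minimum detectable effect quoted above, assuming a
# two-sample comparison with a two-sided alpha of 0.05 and 80% power.
from scipy.stats import norm

alpha, power = 0.05, 0.80
sigma = 0.26          # standard deviation from pilot data
n1 = n2 = 1500        # planned observations per treatment arm

z_alpha = norm.ppf(1 - alpha / 2)   # ≈ 1.96
z_beta = norm.ppf(power)            # ≈ 0.84
mde = (z_alpha + z_beta) * sigma * (1 / n1 + 1 / n2) ** 0.5
print(round(mde, 3))  # ≈ 0.027
```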
IRB
INSTITUTIONAL REVIEW BOARDS (IRBs)
IRB Name
Cornell University Institutional Review Board for Human Participants
IRB Approval Date
2018-02-06
IRB Approval Number
1412005193
Analysis Plan

There are documents in this trial unavailable to the public.
Post-Trial
Post Trial Information
Study Withdrawal
Intervention
Is the intervention completed?
No
Is data collection complete?
Data Publication
Data Publication
Is public data available?
No
Program Files
Program Files
Reports and Papers
Preliminary Reports
Relevant Papers