Evaluating Longshot Preference in the Field
Last registered on October 18, 2019

Pre-Trial

Trial Information
General Information
Title
Evaluating Longshot Preference in the Field
RCT ID
AEARCTR-0004790
Initial registration date
September 28, 2019
Last updated
October 18, 2019 3:06 AM EDT
Location(s)
Region
Primary Investigator
Affiliation
National University of Singapore
Other Primary Investigator(s)
Additional Trial Information
Status
Completed
Start date
2019-10-01
End date
2019-10-16
Secondary IDs
Abstract
We collaborate with a support team in an Asian university, which provides information technology services to the university community. The support team aims to motivate participation in a crossword game, through which users master the use of some of the self-services it provides. To raise the participation rate, each participant receives a lottery as a reward. We design six lotteries and randomly divide the users into six groups, varying the lottery incentive across groups. By comparing the response rate in each group, we can evaluate longshot preference in the setting of motivating behavior.
External Link(s)
Registration Citation
Citation
Zhong, Songfa. 2019. "Evaluating Longshot Preference in the Field." AEA RCT Registry. October 18. https://doi.org/10.1257/rct.4790-1.1.
Experimental Details
Interventions
Intervention(s)
Intervention Start Date
2019-10-01
Intervention End Date
2019-10-16
Primary Outcomes
Primary Outcomes (end points)
The participation rates in the six groups.
Primary Outcomes (explanation)
Secondary Outcomes
Secondary Outcomes (end points)
Secondary Outcomes (explanation)
Experimental Design
Experimental Design
We design six lotteries, which differ in: (1) source of uncertainty (familiar vs. unfamiliar); (2) skewness of the probability distribution; (3) number of prizes; (4) method of probability realization (“randomly draw one out of 100 participants” vs. “bet on the last two digits of some indexes”); (5) vagueness of information (“lucky draw” vs. “win with probability 1%”). We randomly divide all users into six groups; each group receives one of the lotteries as its reward for participation.
Experimental Design Details
Randomization Method
The randomization is done by a computer. For each subject, we generate a random number following a standard normal distribution using Stata. We then sort the sample by the value of this random variable; the first one-sixth of subjects constitutes our first group, and so on.
Randomization Unit
Individual.
Was the treatment clustered?
No
Experiment Characteristics
Sample size: planned number of clusters
This study has no clusters.
Sample size: planned number of observations
60,000 people in university, including staff and students.
Sample size (or number of clusters) by treatment arms
10,000 people per treatment arm
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
IRB
INSTITUTIONAL REVIEW BOARDS (IRBs)
IRB Name
NUS Institutional Review Board
IRB Approval Date
2018-08-06
IRB Approval Number
S-18-181E
Post-Trial
Post Trial Information
Study Withdrawal
Intervention
Is the intervention completed?
No
Is data collection complete?
Data Publication
Data Publication
Is public data available?
No
Program Files
Program Files
Reports and Papers
Preliminary Reports
Relevant Papers