Evaluating personal data sharing choices

Last registered on December 23, 2022

Pre-Trial

Trial Information

General Information

Title
Evaluating personal data sharing choices
RCT ID
AEARCTR-0004005
Initial registration date
March 21, 2019

Initial registration date is when the trial was registered. It corresponds to when the registration was submitted to the Registry to be reviewed for publication.

First published
March 28, 2019, 6:53 PM EDT

First published corresponds to when the trial was first made public on the Registry after being reviewed.

Last updated
December 23, 2022, 5:18 AM EST

Last updated is the most recent time when changes to the trial's registration were published.

Locations

Region

Primary Investigator

Affiliation
Cornell University

Other Primary Investigator(s)

Additional Trial Information

Status
Ongoing
Start date
2019-03-21
End date
2023-12-31
Secondary IDs
Prior work
This trial does not extend or rely on any prior RCTs.
Abstract
A description of this study will be in fields that will not become public until the experiment has completed.
External Link(s)

Registration Citation

Citation
Wu, Joy. 2022. "Evaluating personal data sharing choices." AEA RCT Registry. December 23. https://doi.org/10.1257/rct.4005-5.0
Former Citation
Wu, Joy. 2022. "Evaluating personal data sharing choices." AEA RCT Registry. December 23. https://www.socialscienceregistry.org/trials/4005/history/166801
Experimental Details

Interventions

Intervention(s)
A description of this study will be in fields that will not become public until the experiment has completed.
Intervention Start Date
2019-03-21
Intervention End Date
2020-06-13

Primary Outcomes

Primary Outcomes (end points)
The primary outcome is described in fields that are hidden until the experiment has completed.
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
Secondary outcomes are described in fields that are hidden until the experiment has completed.
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
A description of this study will be in fields that will not become public until the experiment has completed.
Experimental Design Details
Participants are recruited for an online study through the Cornell University IRB-approved participant recruitment system. The Lab pre-screens participants to verify their eligibility. Time slots open approximately 4-9 days before the survey becomes active, and the Lab administrator also sends an advertisement for available studies on Mondays. Approximately 3 days before the survey's participation deadline, the experimenter emails a personal survey link to each registered participant. All participants read an IRB-reviewed and approved online consent form; they must actively consent to the form and verify that they are over 18 years of age.

At the beginning of the survey, participants answer 50 questions taken from the open-source International Personality Item Pool (IPIP), which are widely used to administer personality tests based on the Five-Factor Model. Participants are scored on five traits: Extraversion, Agreeableness, Conscientiousness, Emotional Stability, and Intellect. The survey generates these scores from participants' responses to the 50 questions and displays them to the participant in a table along with their first and last name (collected by the Lab in pre-screening). A description of how to interpret these personality scores is also provided to the participant.
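
A minimal scoring sketch, assuming 10 items per trait on a 1-5 Likert scale with standard reverse keying; the actual item-to-trait mapping and keying used by the survey are not specified in this registration:

```python
# Hedged sketch of Big Five scoring from 50 IPIP items.
# Assumptions (not from the registration): 10 items per trait, 1-5 Likert
# responses, and a per-item reverse-key flag.
from typing import Dict, List, Tuple

TRAITS = ["Extraversion", "Agreeableness", "Conscientiousness",
          "Emotional Stability", "Intellect"]

def score_ipip(responses: List[int],
               item_map: List[Tuple[str, bool]]) -> Dict[str, int]:
    """responses: 50 answers (1-5) in questionnaire order.
    item_map: for each item, (trait name, is_reverse_keyed)."""
    assert len(responses) == len(item_map) == 50
    scores = {trait: 0 for trait in TRAITS}
    for answer, (trait, is_reversed) in zip(responses, item_map):
        # reverse-keyed items are flipped on the 1-5 scale before summing
        scores[trait] += (6 - answer) if is_reversed else answer
    return scores  # with 10 items per trait, each score falls in 10-50
```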

There are four conditions each participant experiences (call them T1, T2, T3, and T4). Each condition presents a different possible outcome participants face should they choose to share their data, and each includes different information about how their data will be shared and/or what their recipient(s) will do with that data. In each scenario, there are 5 different possible prices offered for sharing data, and participants indicate whether they would accept or decline each price for releasing their data. One of the four scenarios is selected to be made real for the participant. The survey randomly assigns participants to one of four condition-order groups. In the first group, participants experience the conditions in the order T1, T2, T3, T4; in the second, T2, T1, T4, T3; in the third, T3, T4, T1, T2; and in the last, T4, T3, T2, T1. The primary outcomes of interest are the revealed choices for each price (in each scenario) and how the information treatments influence those decisions. In a between-subject analysis of subjects' first decision, the second order group will be the control group; for a within-subject analysis, condition T2 will be the control condition. How condition order interacts with each condition will also be analyzed to support the interpretation of findings.
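
A minimal sketch of the assignment and elicitation logic described above, assuming a standard price-list implementation in which one of the elicited scenarios (and, as an additional assumption, one price within it) is drawn at random and implemented; the price values, the `choose` callback, and the use of Python's random module are illustrative assumptions, not details from the registration:

```python
# Hedged sketch: assign a condition order, elicit accept/decline choices at
# five prices per condition, and draw one scenario to be made real.
import random

ORDER_GROUPS = [
    ["T1", "T2", "T3", "T4"],
    ["T2", "T1", "T4", "T3"],  # control group for the between-subject analysis
    ["T3", "T4", "T1", "T2"],
    ["T4", "T3", "T2", "T1"],
]
PRICES = [0.50, 1.00, 2.00, 4.00, 8.00]  # hypothetical price list (in dollars)

def run_participant(choose) -> dict:
    """choose(condition, price) -> True if the participant accepts that price."""
    order = random.choice(ORDER_GROUPS)
    decisions = {cond: {p: choose(cond, p) for p in PRICES} for cond in order}
    realized_cond = random.choice(order)    # one scenario is made real
    realized_price = random.choice(PRICES)  # assumed: one price drawn at random
    return {
        "order": order,
        "decisions": decisions,
        "realized": (realized_cond, realized_price),
        "shares_data": decisions[realized_cond][realized_price],
    }
```

For example, `run_participant(lambda cond, price: price >= 2.00)` simulates a participant who accepts any offer of at least $2 in every condition.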

The exit survey includes a suite of questions about demographics, social media activity, data privacy concerns, and the acceptability of various data privacy scenarios. Participants also enter a lottery to earn additional bonus money if they complete the exit survey questions. The survey also records the time spent on each question. The experiment intends to avoid all deception, so all participant choices to share personal data, and the resulting earnings, are real. Approximately one week after the participation deadline, personal data is released to randomly selected participants by email (under the respective scenario requirements), and the Lab administrator electronically delivers Amazon gift cards with total earnings.
Randomization Method
Done by the online survey tool and in the office by a computer
Randomization Unit
Individual
Was the treatment clustered?
No

Experiment Characteristics

Sample size: planned number of clusters
Approximately 1000 individuals.
Sample size: planned number of observations
Approximately 1000 individuals.
Sample size (or number of clusters) by treatment arms
Equal number of individuals in each treatment and control group
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
IRB

Institutional Review Boards (IRBs)

IRB Name
Cornell University Institutional Review Board for Human Participants
IRB Approval Date
2018-08-27
IRB Approval Number
1806008035

Post-Trial

Post Trial Information

Study Withdrawal

There is information in this trial unavailable to the public.

Intervention

Is the intervention completed?
No
Data Collection Complete
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials