
Increasing Personal Data Contributions: Field Experimental Evidence from an Online Education Platform
Last registered on September 09, 2019


Trial Information
General Information
Title
Increasing Personal Data Contributions: Field Experimental Evidence from an Online Education Platform
RCT ID
AEARCTR-0004604
Initial registration date
September 07, 2019
Last updated
September 09, 2019 9:59 AM EDT
Location(s)

This section is unavailable to the public.
Primary Investigator
Affiliation
University of Cologne
Other Primary Investigator(s)
PI Affiliation
University of Potsdam
Additional Trial Information
Status
In development
Start date
2019-09-09
End date
2020-03-31
Secondary IDs
Abstract
In this project, we study a new type of contribution to an (impure) public good: personal data. We do so in the context of a German online education platform that provides free online courses but lacks sufficient data about its participants to further improve its services. We design soft behavioral interventions that highlight higher benefits and lower costs of contribution. In one treatment, we increase the salience of the public benefit of personal data contributions for the whole user community. In the second treatment, we additionally highlight data protection standards, thereby reducing potentially overestimated privacy costs. If our intervention is effective, we plan to use machine learning methods to study whether it indeed increases the quality of the public online education good. For example, we may explore whether a larger base of personal data generates more precise predictions of course-related outcomes such as drop-out and grades.
External Link(s)
Registration Citation
Citation
Ackfeld, Viola and Sylvi Rzepka. 2019. "Increasing Personal Data Contributions: Field Experimental Evidence from an Online Education Platform." AEA RCT Registry. September 09. https://doi.org/10.1257/rct.4604-1.0.
Sponsors & Partners

There are documents in this trial unavailable to the public.
Experimental Details
Interventions
Intervention(s)
In treatment 1, we increase the salience of the public benefit of personal data contributions for the whole user community. In treatment 2, we additionally highlight data protection standards, thereby reducing potentially overestimated privacy costs.
Intervention Start Date
2019-09-09
Intervention End Date
2019-12-20
Primary Outcomes
Primary Outcomes (end points)
a) Intention to provide data: Clicked on link to profile in pop-up
b) Extensive margin: User has any entry in profile (yes/no)
c) Intensive margin: Number of entries filled
Primary Outcomes (explanation)
a) the click is recorded if the forwarding page to the profile is visited (0 = "profile not visited", 1 = "profile visited")
b) coded as dummy variable (0 = "no profile entries", 1 = ">0 profile entries")
c) count how many profile entries are filled. Categories: date of birth, affiliation, career status, highest degree, background in IT, professional life, position, city, gender, country. New categories: motivation, computer use at work
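The outcome coding described above (b and c) can be sketched as follows; the column names and toy data are hypothetical, since the platform's actual schema is not public:

```python
import pandas as pd

# Hypothetical profile snapshot: one row per user, one column per category.
CATEGORIES = ["date_of_birth", "affiliation", "career_status", "highest_degree",
              "background_in_it", "professional_life", "position", "city",
              "gender", "country", "motivation", "computer_use_at_work"]

profiles = pd.DataFrame({"user_id": [1, 2, 3]})
for c in CATEGORIES:
    profiles[c] = None                                  # start with all categories empty
profiles.loc[0, ["city", "gender"]] = ["Cologne", "f"]  # toy entries for one user

# b) extensive margin: dummy = 1 if the user has any profile entry
profiles["any_entry"] = profiles[CATEGORIES].notna().any(axis=1).astype(int)
# c) intensive margin: count of filled profile categories
profiles["n_entries"] = profiles[CATEGORIES].notna().sum(axis=1)
```

In this toy data, only the first user has entries, so `any_entry` is 1 and `n_entries` is 2 for that row, and both are 0 for the others.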
Secondary Outcomes
Secondary Outcomes (end points)
d) Intensive margin by privacy-sensitivity: Number of entries filled separately for privacy sensitive / insensitive categories
e) Directional intensive margin: Number of entries reduced / added separately
f) Intensive margin: Updates of profile categories that were filled previously
Secondary Outcomes (explanation)
d) Entries are rated by treatment-blind student assistants on a scale from 1 = "not at all sensitive" to 7 = "completely sensitive". We conduct a mean split based on the average rating by the student assistants.
e) as c) but additionally separate whether change is an extension or reduction of profile content
f) Dummy = 1 if an entry is still filled but with different content than before (we expect only a few changes to occur here, but this information may be relevant, e.g., for updating the professional status)
Experimental Design
Experimental Design
Our treatments are implemented as simple pop-up messages embedded in the online learning platform in the second course week. We collect pre- and post-intervention profile data 5-6 days after course start (before course week 2) and 21-22 days after course start (or as close to these dates as possible if they are not workdays), respectively. We experimentally vary the text prompting participants to review their user profile. In the first treatment, we emphasize the public benefit of sharing personal data. In the second treatment, we additionally emphasize data protection standards.
Experimental Design Details
Not available
Randomization Method
by computer (via infrastructure of the platform)
Randomization Unit
Randomization into treatments takes place at the beginning of the second course week of each course, based on platform-wide user IDs. Using platform-wide user IDs ensures that a participant enrolled in several courses is always assigned to the same treatment group.
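The registration states only that assignment is done by the platform's infrastructure using platform-wide user IDs. One common way to implement such stable, course-independent assignment is a deterministic hash of the user ID; the sketch below is an illustration under that assumption, not the platform's actual code (the salt string is hypothetical):

```python
import hashlib

ARMS = ["control", "treatment1", "treatment2"]

def assign_arm(user_id: str, salt: str = "data-contribution-2019") -> str:
    """Deterministically map a platform-wide user ID to a treatment arm.

    Because the mapping depends only on the user ID (plus a fixed study
    salt), the same user lands in the same arm in every course.
    """
    digest = hashlib.sha256(f"{salt}:{user_id}".encode()).hexdigest()
    return ARMS[int(digest, 16) % len(ARMS)]

# Stable across calls and courses:
assert assign_arm("user-42") == assign_arm("user-42")
```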
Was the treatment clustered?
No
Experiment Characteristics
Sample size: planned number of clusters
N/A
Sample size: planned number of observations
All participants who are active in the second and/or third course week are part of our experimental sample. Due to technical reasons, we only consider users who access the course by computer. It is not yet clear how many participants each course attracts; across all courses, we aim to reach at least 13,000 participants in total.
Sample size (or number of clusters) by treatment arms
We divide this sample evenly into one control and two treatment groups, i.e., at enrollment we aim to have at least 4,300 participants in each group.
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
With at least 4,300 participants per group at enrollment, we have enough power to detect a 10% effect size at the extensive margin and a 5% effect at the intensive margin (number of profile categories filled out).
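A standard way to back out the required sample size per arm is the normal approximation for a two-sided two-proportion test. The sketch below is illustrative only: the registration does not state the baseline contribution rate, so the numbers passed in (a 20% baseline and a 10-percentage-point effect) are placeholders, not the authors' actual power calculation:

```python
import math
from statistics import NormalDist

def n_per_arm(p1: float, p2: float, alpha: float = 0.05, power: float = 0.8) -> int:
    """Required sample size per arm for a two-sided two-proportion z-test
    (normal approximation)."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)   # critical value for alpha
    z_b = NormalDist().inv_cdf(power)           # quantile for desired power
    var = p1 * (1 - p1) + p2 * (1 - p2)
    return math.ceil((z_a + z_b) ** 2 * var / (p1 - p2) ** 2)

# Placeholder inputs: 20% baseline, 10-percentage-point effect
print(n_per_arm(0.20, 0.30))
```

Smaller effects require markedly larger samples, which is why the intensive-margin outcome (a count rather than a binary indicator) can detect a smaller effect at the same sample size.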
Supporting Documents and Materials

There are documents in this trial unavailable to the public.
IRB
INSTITUTIONAL REVIEW BOARDS (IRBs)
IRB Name
Ethics Review Board (ERB) of the Department of Management, Economics, and Social Sciences at the University of Cologne
IRB Approval Date
2019-09-04
IRB Approval Number
19023VA
Analysis Plan

There are documents in this trial unavailable to the public.