Self-control, naivete and completion rate for MOOCs: a field experiment

Last registered on January 05, 2020

Pre-Trial

Trial Information

General Information

Title
Self-control, naivete and completion rate for MOOCs: a field experiment
RCT ID
AEARCTR-0005144
Initial registration date
December 28, 2019
First published
January 05, 2020, 11:30 PM EST

Locations

Region

Primary Investigator

Affiliation
Humboldt University of Berlin, WZB Berlin Social Science Center

Other Primary Investigator(s)

PI Affiliation
Department of Economics, University of Pittsburgh
PI Affiliation
School of Economics and Management, Tsinghua University

Additional Trial Information

Status
In development
Start date
2019-12-30
End date
2020-01-31
Secondary IDs
Abstract
The low completion rates of Massive Open Online Courses (MOOCs) have long been a controversial topic. While some see the low completion rates as a sign of inefficiency, others argue that the completion rate is a misleading index because MOOCs give students the freedom to learn what they want without finishing the course. However, the latter argument only makes sense if students do achieve their personalized learning goals on MOOCs. This project aims to answer two sets of questions. First, does self-control play a role in explaining the low completion rates of MOOCs? If the answer is yes, the low completion rates are indeed a problem. The next question, then, is whether we can design interventions to raise the completion rate. In particular, we plan to design personalized commitment devices based on students' present bias. One design choice we need to make is whether to inform students that their answers to the time preference elicitation questions will be used to calculate the optimal commitment devices for them. If they know in advance that the elicited present bias will affect the commitment contracts they later receive, they have an incentive to game the system. Whether they will nevertheless answer the time preference questions truthfully is an empirical question, and we plan to test it before we implement the interventions. The basic idea is to randomly select half of the subjects and inform them of the future personalized interventions before they answer the questions, while leaving the other half uninformed. We can then test whether the elicited time preferences/present biases of the two groups are drawn from the same distribution. If we cannot reject the null hypothesis, we can conclude that it is safe to inform subjects in advance that the commitment devices they will receive are contingent on their elicited time preferences.
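As a hedged illustration of the planned distributional comparison (the registration does not specify a test statistic), the informed and uninformed groups' elicited present-bias parameters could be compared with a two-sample Kolmogorov-Smirnov test. The file name, column names, and loading step below are hypothetical, not taken from the registration.

# Minimal sketch of the informed-vs-uninformed comparison of elicited present bias (beta).
# Assumption: one row per subject, with an `informed` indicator (0/1) and an elicited `beta`.
import pandas as pd
from scipy import stats

df = pd.read_csv("elicited_preferences.csv")           # hypothetical file
beta_informed = df.loc[df["informed"] == 1, "beta"]    # told about future personalized interventions
beta_uninformed = df.loc[df["informed"] == 0, "beta"]  # not told

# Two-sample Kolmogorov-Smirnov test of whether the two samples come from the same distribution.
ks_stat, p_value = stats.ks_2samp(beta_informed, beta_uninformed)
print(f"KS statistic = {ks_stat:.3f}, p-value = {p_value:.3f}")

# Failing to reject the null (a large p-value) would be consistent with it being safe
# to tell subjects in advance that the interventions depend on their elicited preferences.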
External Link(s)

Registration Citation

Citation
Liu, Xiao, Yiming Liu, and Stephanie Wang. 2020. "Self-control, naivete and completion rate for MOOCs: a field experiment." AEA RCT Registry. January 05. https://doi.org/10.1257/rct.5144-1.0
Experimental Details

Interventions

Intervention(s)
Intervention Start Date
2019-12-30
Intervention End Date
2020-01-31

Primary Outcomes

Primary Outcomes (end points)
Present bias, measured with both money (incentivized) and effort (hypothetical).
Primary Outcomes (explanation)
Beta as in the (beta, delta) quasi-hyperbolic discounting model. We use both money and effort to measure beta, the present-bias parameter.

In the money case, we adopt the multiple price list measure of time preference from Falk et al. (2018), with two modifications. First, our questions are incentivized: we randomly select 1% of subjects for payment. Second, we ask the same questions twice to elicit present bias. On day 1, subjects choose between less money one week later and more money one year plus one week later. On day 8, one week after day 1, they again choose between less money today and more money one year later; "today" on day 8 is the same date as "one week later" on day 1. All parameters are consistent with those in the China survey of Falk et al. (2018).

We also elicit time preferences with a convex time budget design using hypothetical effort. In those questions, subjects allocate course-video watching hours between an earlier date and a later date, with a fixed total number of hours and varying exchange rates between the two dates. The payment for finishing this task is fixed but hypothetical. Consistent with the monetary measure, subjects make two sets of choices: advance choices and immediate choices. We adopt the parameters of Augenblick et al. (2015) for the hypothetical effort questions.
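For concreteness, here is a minimal identification sketch under the standard assumptions of linear utility and an annual long-run discount factor delta; the registration itself does not commit to this particular estimation approach. In the quasi-hyperbolic model,

U_0 = u(c_0) + \beta \sum_{t=1}^{T} \delta^{t}\, u(c_t),

the day-1 (advance) and day-8 (immediate) switch points jointly identify beta. On day 1, both payments lie in the future, so indifference between the switch-point amount x_1 in one week and y in one year plus one week gives

\beta \delta^{1/52} x_1 = \beta \delta^{1 + 1/52} y \;\Rightarrow\; x_1 = \delta\, y,

and the advance switch point pins down delta. On day 8, the earlier payment is immediate, so indifference between the switch-point amount x_8 today and y in one year gives

x_8 = \beta \delta\, y \;\Rightarrow\; \beta = x_8 / x_1,

with beta < 1 indicating present bias.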

Secondary Outcomes

Secondary Outcomes (end points)
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
There is a treatment group and a control group. Subjects in the treatment group receive a message before they submit their answers to the questions we use to elicit their time preferences. The message informs them that the following questions will elicit their preferences and that the elicited preferences will be used to calculate optimal personalized interventions to enhance their performance on the MOOC platform. We are deliberately vague about which specific preferences the questions elicit, how the answers will be translated into preference measures, and how the measured preferences will affect the interventions they receive.
Experimental Design Details
Randomization Method
Simple randomization with a pseudo-random number generator (a minimal sketch follows below).
Randomization Unit
individual
Was the treatment clustered?
No
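The sketch below illustrates the simple individual-level randomization stated above, assuming a 1:1 split of the roughly 2,000 planned subjects; the seed and identifiers are hypothetical.

# Simple (non-stratified) individual-level randomization with a pseudo-random number generator.
import numpy as np

rng = np.random.default_rng(seed=20191230)   # hypothetical seed
n_subjects = 2000                            # planned number of observations

# Each subject is independently assigned to the informed (treatment) or
# uninformed (control) group with probability 1/2.
informed = rng.integers(0, 2, size=n_subjects).astype(bool)

# If an exact 1,000/1,000 split is desired instead, a random permutation of a
# balanced assignment vector would achieve it.

print(f"Treatment: {informed.sum()}, Control: {n_subjects - informed.sum()}")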

Experiment Characteristics

Sample size: planned number of clusters
N/A
Sample size: planned number of observations
2000
Sample size (or number of clusters) by treatment arms
1,000 subjects in each arm (treatment and control); 2 arms in total
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
IRB

Institutional Review Boards (IRBs)

IRB Name
IRB Approval Date
IRB Approval Number

Post-Trial

Post Trial Information

Study Withdrawal


Intervention

Is the intervention completed?
No
Data Collection Complete
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials