Improving Online Learning through Course Design: A Microeconomic Approach

Last registered on January 11, 2022


Trial Information

General Information

Improving Online Learning through Course Design: A Microeconomic Approach
Initial registration date
January 09, 2022

Initial registration date is when the trial was registered.

It corresponds to when the registration was submitted to the Registry to be reviewed for publication.

First published
January 11, 2022, 9:00 AM EST

First published corresponds to when the trial was first made public on the Registry after being reviewed.



Primary Investigator

University of Stavanger Business School

Other Primary Investigator(s)

Additional Trial Information

Ongoing
Start date
End date
Secondary IDs
Prior work
This trial is based on or builds upon one or more prior RCTs.
Online education has expanded dramatically over the past two decades, yet significant learning challenges remain. In light of these, my paper provides the first microeconomic analysis to examine how the quality of online university courses can be enhanced through course design, addressing the twin needs of providing individualized support to students and keeping them engaged with online coursework.
First, I gather rich data on undergraduates at a large public university taking an online introductory programming course with a cumulative structure. The data allow me to monitor students' study time precisely and to characterize important dimensions of heterogeneity: student attentiveness and whether students are forward-looking. I then conduct two randomized interventions that nudge students to make fuller use of an online discussion board and to complete online assignments. Next, I develop and estimate a behavioural model of student effort supply, using the two field experiments to credibly identify the marginal benefits and costs of effort at each stage of the cumulative learning process. The estimated model allows me to explore the efficacy of changing assignment grading weights to improve student learning. In contrast to the actual (equally weighted) grading scheme, simulated weights that maximize learning are decreasing across assignments, serving to increase effort by myopic students early in the course, when they acquire foundational skills. My course-design approach applies more generally to other online and traditional course settings.
External Link(s)

Registration Citation

Shaikh, Hammad. 2022. "Improving Online Learning through Course Design: A Microeconomic Approach." AEA RCT Registry. January 11.
Experimental Details


The interventions considered in this study can be categorized as "targeted informational reminders" because their design includes the following elements: 1) they prompt students to take a specific action (e.g., registering for the discussion board), 2) they provide clear information on how to execute the action (e.g., instructions for signing up for the discussion board), and 3) they serve as a reminder of the specified task (e.g., that the course uses a discussion board).

Intervention 1: discussion board sign-up activity

The sign-up activity is designed to promote discussion board registration. The activity comprises the following elements: 1) it presents a link to the discussion board sign-up page, 2) it uses screenshots to illustrate the key sign-up steps, 3) it summarizes all steps in an animated GIF, and 4) it discloses information about the discussion board to students. The online activity consists of two pages: the first contains the sign-up instructions, and the second includes information about the discussion board. The informational page discusses the functionality of the discussion board and discloses the proportion of students' questions that have been answered by either a peer or the instructor. Students in the treatment group are randomly assigned to receive either the sign-up instructions only (i.e., page 1), the discussion board information only (i.e., page 2), or both.

Intervention 2: homework reminder message

The reminder messages aim to encourage students to participate further in their weekly low-stakes homework. Reminders are only sent for the graded homework assessments after week 2. Each homework reminder comprises three elements: 1) it reminds students of the upcoming homework deadline, 2) it prompts them to set aside time in their schedule to make progress on the homework, and 3) it includes a direct link to the homework assessment.
Intervention Start Date
Intervention End Date

Primary Outcomes

Primary Outcomes (end points)
Final exam grade
Primary Outcomes (explanation)
The final exam is the most comprehensive measure of learning in the course.

Secondary Outcomes

Secondary Outcomes (end points)
Weekly homework grade
Secondary Outcomes (explanation)
Weekly homework grades measure the accumulation of knowledge throughout each stage of the course.

Experimental Design

Experimental Design
Intervention 1: discussion board sign-up activity
Among students who completed the baseline survey and did not register for the discussion board within the first week of the course, half were randomly assigned to receive the sign-up activity.

Intervention 2:
Among students who had not completed the homework before a reminder message was deployed, half were randomly assigned to receive the reminder. Reminder messages were sent throughout the course and were re-randomized with each deployment. Consequently, the total number of homework reminders a student receives follows a binomial distribution with 10 trials and a 0.5 probability of success.
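As a minimal sketch of why the reminder count is binomial: with each weekly deployment re-randomized independently at probability 0.5 across 10 deployments, the per-student total is the sum of 10 independent Bernoulli(0.5) draws. The simulation below (in Python; the study itself used Stata, and all names here are illustrative) checks that the simulated mean is near 10 × 0.5 = 5.

```python
import random

def simulate_reminders(n_students=1000, n_weeks=10, p=0.5, seed=0):
    """Simulate weekly re-randomized reminder assignment.

    Each week, every eligible student is independently assigned a
    reminder with probability p, so the per-student total follows
    a Binomial(n_weeks, p) distribution.
    """
    rng = random.Random(seed)
    counts = []
    for _ in range(n_students):
        total = sum(1 for _ in range(n_weeks) if rng.random() < p)
        counts.append(total)
    return counts

counts = simulate_reminders()
mean_reminders = sum(counts) / len(counts)  # close to n_weeks * p = 5
```

This abstracts from the fact that eligibility (not yet having completed the homework) varies by week; the binomial description applies to students eligible in every deployment.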

Experimental Design Details
Intervention 3: self-reflection on responsiveness to assignment weights
Students are randomized at the end-of-course survey to a hypothetical scenario in which three homework assignments, equally spaced across the programming course, have weights of 1) 10%, 10%, 10% (constant), 2) 5%, 10%, 15% (increasing), or 3) 15%, 10%, 5% (decreasing). Student responses are recorded on how they would allocate 6 hours of total study time across the three assignments. I will also randomize whether students are explicitly told that the assignments have a cumulative structure.
Randomization Method
The study followed a double-blind protocol for implementing the randomized interventions: students were not informed of their treatment status but were aware that a study was being conducted for the purpose of improving course design. Students were assigned using complete randomization. Specifically, each student on the anonymized list of consenting students was assigned a random number drawn from U[0,1] using Stata. Students were then ranked by this draw; the top 50% were assigned to the control group and the bottom 50% to the treatment group.
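The complete-randomization procedure described above can be sketched as follows. This is an illustrative Python reimplementation (the study used Stata), with hypothetical function and variable names; the essential steps are the U[0,1] draw, the ranking, and the 50/50 split.

```python
import random

def complete_randomization(student_ids, seed=0):
    """Assign each student a U[0,1] draw, rank by the draw,
    and place the top 50% in control and the bottom 50% in treatment."""
    rng = random.Random(seed)
    draws = {sid: rng.random() for sid in student_ids}
    # Rank from highest to lowest draw.
    ranked = sorted(student_ids, key=lambda sid: draws[sid], reverse=True)
    half = len(ranked) // 2
    control = set(ranked[:half])
    treatment = set(ranked[half:])
    return control, treatment

control, treatment = complete_randomization(list(range(100)))
```

Unlike independent coin flips per student, this scheme fixes the group sizes exactly (up to one student when the sample size is odd).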
Randomization Unit
The randomization is carried out at the student level within the week the intervention is deployed.
Was the treatment clustered?

Experiment Characteristics

Sample size: planned number of clusters
Around 4500 students.
Sample size: planned number of observations
Around 4500 students.
Sample size (or number of clusters) by treatment arms
Around 2250 students in the control group, and 2250 students in the treatment group.
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)

Institutional Review Boards (IRBs)

IRB Name
University of Toronto (U of T) Research Ethics Boards (REBs)
IRB Approval Date
IRB Approval Number
Analysis Plan

There is information in this trial unavailable to the public.


Post Trial Information

Study Withdrawal

There is information in this trial unavailable to the public.


Is the intervention completed?
Data Collection Complete
Data Publication

Data Publication

Is public data available?

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials