Understanding the Importance of Summer Learning in a Pandemic

Last registered on September 03, 2020


Trial Information

General Information

Understanding the Importance of Summer Learning in a Pandemic
Initial registration date
September 01, 2020


First published
September 02, 2020, 10:37 AM EDT


Last updated
September 03, 2020, 6:16 PM EDT



Some information in this trial is not available to the public.

Primary Investigator

University of Notre Dame

Other Primary Investigator(s)

PI Affiliation
Vanderbilt University
PI Affiliation
University of Notre Dame
PI Affiliation
University of Notre Dame
PI Affiliation
University of Notre Dame

Additional Trial Information

Ongoing
Start date
End date
Secondary IDs
This study seeks to better understand the role of individualized, virtual summer instruction in redressing learning losses resulting from the COVID-19 school closures. In particular, we are interested in the potential for virtual learning to combat the negative effects of time out of school on achievement gaps. Given the mixed evidence on virtual learning platforms, especially for disadvantaged populations, the prospect of repeated school closures in the upcoming school year points to an urgent need to identify optimal strategies for online education. Our research design is based on a randomized controlled trial evaluation of a unique summer program in NYC which will be offered free of charge to public school students in grades 3-8. All study students will have access to the virtual study materials, and treatment group students will additionally receive synchronous virtual instruction and comprehensive supports. Online assessments will be conducted in math and reading for all students in the study, providing not only a measure of program impact but also a baseline description of students’ skill levels following a shortened school year.
External Link(s)

Registration Citation

Brough, Rebecca et al. 2020. "Understanding the Importance of Summer Learning in a Pandemic." AEA RCT Registry. September 03. https://doi.org/10.1257/rct.6392
Experimental Details


We are working with Practice Makes Perfect (PMP), an organization in New York City that is about to launch a guided online summer learning program for up to 5,000 mostly low-income public school students. This 8-week program will consist of synchronous, interactive classes in math and reading, including one-on-one time between students and teachers, and will be offered at no cost to the students. Given the extraordinary academic circumstances students faced this past spring, we expect there will be great interest in this summer program. This program will be evaluated using a randomized controlled trial. Eligible and interested students will be randomly assigned to one of two treatment arms. The first arm will receive the full program, and the second arm will receive a light-touch version of the program: access to the same set of online materials used by the treatment group, but without a dedicated instructor or the format of the guided lessons. All fieldwork will be conducted remotely, including recruitment, online learning, surveys, and skill assessments.
Intervention Start Date
Intervention End Date

Primary Outcomes

Primary Outcomes (end points)
Our main outcome of interest is student academic progress over the 8-week course. All study participants will complete online assessments in math and reading at the beginning and end of the program. While we also plan to collect MAP (Measures of Academic Progress) scores from administrative records, the ongoing crisis may prevent schools from conducting these tests in Fall 2020. In light of this risk, a high completion rate of the skills assessments by students in both experimental arms is critical. We plan to use both monetary incentives and non-monetary nudges to encourage completion of the assessments. Non-monetary incentives will include verbal confirmation and congratulations from the instructor, and visual cues to track completion within the virtual program.
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
The research team is interested in understanding how best to offer education services to students in a virtual setting. The COVID-19 crisis has demonstrated a need to understand how best to deliver educational services virtually. The summer months already pose a unique equity challenge, as students from low socioeconomic backgrounds are at greater risk of experiencing learning loss compared to other groups. This, in combination with the varied academic support students across NYC received in the spring semester of 2020, demonstrates a need to understand how best to administer summer learning programs.

This study will operate as a randomized controlled trial. Students interested in participating in the summer program will be randomly assigned to one of two groups. The treatment group will receive access to online platforms as well as to mentors and teachers through virtual platforms. The control group will have access to online platforms, but not to live-virtual sessions with teachers or mentors.
Experimental Design Details
Not available
Randomization Method
This study will use Stata programming to randomize students. Randomization will be stratified by students' grade and zip code.

Prior to randomization, ordered lists were created for each zip code-grade pair. Each ordered list was split into blocks of 6, and each block was then randomly ordered using a randomly generated number between 0 and 1. The 3 lowest-ranking values in each block were assigned to the treatment group; the remaining 3 were assigned to the control group.

On June 22nd, new lists were created to reflect a 70:30 treatment-control ratio.
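The registry does not include the original Stata code. As an illustration only, the stratified block randomization described above can be sketched in Python; the student record fields (`id`, `grade`, `zip`) and the function name are hypothetical, and short final blocks are handled with a simplification the registration does not specify.

```python
import random
from collections import defaultdict

def block_randomize(students, block_size=6, n_treat=3, seed=42):
    """Stratify students by (grade, zip) and assign treatment within blocks.

    `students` is a list of dicts with hypothetical keys 'id', 'grade', 'zip'.
    Within each stratum, consecutive blocks of `block_size` students are
    randomly ordered via a uniform draw per student; the `n_treat`
    lowest-ranking students in each block go to treatment.
    """
    # Group students into strata by grade and zip code.
    strata = defaultdict(list)
    for s in students:
        strata[(s["grade"], s["zip"])].append(s)

    rng = random.Random(seed)
    assignment = {}
    for key in sorted(strata):
        roster = strata[key]
        # Split the stratum's ordered list into blocks of `block_size`.
        for i in range(0, len(roster), block_size):
            block = roster[i:i + block_size]
            # Randomly order the block using a uniform [0, 1) draw per student.
            ranked = sorted(block, key=lambda _s: rng.random())
            for j, s in enumerate(ranked):
                # Lowest-ranking draws go to treatment (simplification: a
                # short final block still sends its first n_treat to treatment).
                assignment[s["id"]] = "treatment" if j < n_treat else "control"
    return assignment
```

With full blocks this yields the exact 3-of-6 treatment ratio described above; switching to the later 70:30 design would change `n_treat` and `block_size` (e.g. 7 of 10).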
Randomization Unit
Randomization occurs at the student level, stratified by student grade and zip code, in blocks of 6 (ensuring that 3 of every 6 students were assigned to the treatment group).

Starting June 22, students were randomized using a 70:30 ratio (70% assigned to the treatment group; 30% to the control group).
Was the treatment clustered?

Experiment Characteristics

Sample size: planned number of clusters
Sample size: planned number of observations
Approximately 11,000 students.
Sample size (or number of clusters) by treatment arms
We have successfully enrolled 10,839 students into the study, with 6,303 in the treatment group and 4,536 in the control group.
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
In calculating sample requirements for the study, we consider four different levels of take-up rates. In the summer of 2018, PMP observed a take-up rate of approximately 25%. However, the recruitment method during the summer of 2018 differs greatly from what is being implemented this summer. Previously, partnering schools would recommend students to the PMP program, and PMP would then contact those students to enroll; this method yielded a 25% take-up rate. In contrast to previous years, this summer PMP is requiring that families complete an interest survey before being considered for a place in the PMP program. This additional screening mechanism should increase our observed take-up (as a proportion of families who complete the survey), and therefore, for purposes of this analysis, we consider 25% a lower bound for our projected take-up. We expect this summer’s take-up rate to fall between 50% and 75%.

Power calculations for these scenarios suggest that with our target sample size of 10,000, we will be able to detect an increase of between 0.075 and 0.112 standard deviations for education testing outcomes. An impact of at least this magnitude is reasonable to expect from a successful summer learning program. For comparison, Cooper et al. (1996) estimate that the average effect of remedial summer programs is about 0.2 standard deviations. Other well-known evaluations of summer school programs include the Teach Baltimore Summer Academy (Borman and Dowling, 2006) and BELL (Chaplin and Capizzano, 2006); both studies indicate that one summer of active participation improved reading by roughly 0.15 standard deviations. It should be noted that these comparison programs are in-person programs that are arguably more intensive than the one being implemented by PMP.
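The registration does not spell out the power calculation, but the 0.075-0.112 SD range quoted above is consistent with the standard two-arm minimum detectable effect formula, scaled by take-up to convert the intent-to-treat MDE into a treatment-on-the-treated MDE. The sketch below assumes 80% power, a 5% two-sided significance level, a standardized outcome (variance 1), equal allocation, and no covariate adjustment; these are our assumptions, not details stated in the registration.

```python
from statistics import NormalDist

def mde_tot(n, take_up, p_treat=0.5, alpha=0.05, power=0.80):
    """Minimum detectable treatment-on-the-treated effect, in SD units.

    Standard two-arm formula: MDE_ITT = (z_{1-alpha/2} + z_power)
    * sqrt(1 / (p(1-p) * n)), then scaled up by dividing by the
    take-up rate. Assumes outcome variance 1 and no clustering.
    """
    z = NormalDist().inv_cdf(1 - alpha / 2) + NormalDist().inv_cdf(power)
    mde_itt = z * (1.0 / (p_treat * (1 - p_treat) * n)) ** 0.5
    return mde_itt / take_up

# Reproduce the registered scenarios with n = 10,000:
for tau in (0.50, 0.75):
    print(f"take-up {tau:.0%}: MDE ~ {mde_tot(10_000, tau):.3f} SD")
```

Under these assumptions, take-up of 75% gives an MDE of about 0.075 SD and take-up of 50% gives about 0.112 SD, matching the range reported above.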
Supporting Documents and Materials

Not available to the public.

Institutional Review Boards (IRBs)

IRB Name
University of Notre Dame Research Compliance
IRB Approval Date
IRB Approval Number
Analysis Plan

Not available to the public.