Supporting Student Engagement with Remote Learning: A randomized controlled trial in Chicago Public Schools
Last registered on September 18, 2020


Trial Information
General Information
Title
Supporting Student Engagement with Remote Learning: A randomized controlled trial in Chicago Public Schools
RCT ID
AEARCTR-0006250
Initial registration date
September 17, 2020
Last updated
September 18, 2020 2:29 PM EDT
Location(s)

This section is unavailable to the public.
Primary Investigator
Affiliation
University of Chicago
Other Primary Investigator(s)
PI Affiliation
University of Chicago
PI Affiliation
Northwestern University
PI Affiliation
University of Toronto
Additional Trial Information
Status
Ongoing
Start date
2020-07-06
End date
2021-06-01
Secondary IDs
Abstract
With the disruption in classroom learning caused by the COVID-19 pandemic, many students may be falling behind in their learning trajectories and less prepared to begin their next grade in the fall. Some researchers predict a 30-50% learning loss across subjects due to the COVID-19 pandemic, and up to a year’s worth of learning loss in mathematics, and there is evidence to suggest that the pandemic will reify and exacerbate existing systemic inequalities. Most U.S. school districts have turned to remote learning to mitigate this learning loss; however, they face many challenges in ensuring that high-quality content is delivered and used by students, including technical access, curriculum building, and student engagement. We are partnering with Chicago Public Schools (CPS) to conduct three large-scale randomized controlled trials to understand what works to engage and support students in a virtual summer learning program.

In the first RCT, we will randomize students who are eligible for the Summer Learning program but do not register by CPS’ deadline into an intensive outreach treatment, to learn how we can better encourage families to sign up for this program. In the second and third RCTs, we study how best to increase student engagement with remote learning and with the online Khan Academy platform offered by the Summer Learning program, by randomizing students to receive different forms of learning and engagement support. We hope the three studies will inform future planning and implementation of remote learning efforts in the district in the coming school year.
External Link(s)
Registration Citation
Citation
Bhatt, Monica et al. 2020. "Supporting Student Engagement with Remote Learning: A randomized controlled trial in Chicago Public Schools." AEA RCT Registry. September 18. https://doi.org/10.1257/rct.6250-1.0.
Experimental Details
Interventions
Intervention(s)
BACKGROUND

With the disruption in classroom learning caused by the COVID-19 pandemic, many students may be falling behind in their learning trajectories and less prepared to begin their next grade in the fall. Some researchers predict a 30-50% learning loss across subjects due to the COVID-19 pandemic, and up to a year’s worth of learning loss in mathematics, and there is evidence to suggest that the pandemic will reify and exacerbate existing systemic inequalities. Since remote learning will in many cases be the only mode of delivery, any proposal to mitigate this learning loss must consider access to hardware and connectivity. However, sufficient technical access still does not solve perhaps the most important challenge: ensuring that high-quality content is delivered and used by students.

The Chicago Public Schools (CPS) Summer Learning program is one proposal to address the remote learning needs of students and successfully prepare them for SY2021.

The CPS Summer Learning program offers students in grades 1-8 who struggled to engage with virtual learning in Spring 2020 an important opportunity to re-engage with the material over the summer and, ultimately, successfully transition to the next grade level in SY2021. CPS students are eligible for the Summer Learning program if they received an ‘Incomplete’ in Math, Reading, or both during the Spring 2020 academic semester. As part of the Summer Learning program, CPS is utilizing the online Khan Academy platform, a free computer-assisted learning (CAL) program available to teachers, students, and parents to support students in mathematics. The program is designed to save teachers’ time, lower education costs, and offer a more customizable learning experience to students. The goal is to facilitate interactive learning while allowing students to progress at their own pace. Such a readily available, no-cost platform makes Khan Academy a promising tool for CPS to scale district-wide.

However, for students to benefit from the Summer Learning program and its curriculum, they first must be equipped with the proper information, technology, and face-to-face support to not only register but successfully participate in this virtual program.

INTERVENTIONS

We focus on three primary questions for this program evaluation:

1. Do personal outreach efforts to families induce registration for Summer Learning?
2. Do text message reminders increase attendance in Summer Learning?
3. Do personalized supports such as attendance monitoring, outreach to families, and individualized tutoring sessions increase attendance and performance in Summer Learning?

In addition, we are interested in the following secondary research questions:
4. What is the synergistic effect, if any, of getting both the text message reminders and the personalized supports during the Summer Learning program?
5. For this population of students, do additional supports during a four-week summer learning program influence their engagement with online learning in the following school year?

We plan to answer these questions by conducting three large-scale randomized controlled trials (RCTs).

In the first RCT (referred to as Experiment 1 hereafter), we will randomize students who are eligible for the Summer Learning program but do not register by CPS’ registration deadline into two arms: an outreach worker treatment group and a control group. This effort will give CPS the opportunity to better understand whether personal outreach can improve program take-up. For students randomly assigned to the outreach worker treatment, outreach workers will contact students’ families using existing CPS contact information during the two weeks prior to the Summer Learning program. During this outreach, workers will inform families about the Summer Learning program and their child’s eligibility to participate; answer any questions families may have about the program; collect information on student barriers to virtual learning and/or hesitations about registering; and assist families with filling out the registration form. The control group will consist of randomly assigned students whose families are not contacted by outreach workers after the registration deadline has passed.

In the second RCT (referred to as Experiment 2 hereafter), which takes place while the Summer Learning program is in session, we will randomize students who have registered for the Summer Learning program and consented to receive text messages into two arms: a text message reminder treatment and a control group. The text message reminders will be sent to parents to remind them of Summer Learning class times and to encourage them to have their child join their Summer Learning sessions.

In the third RCT (referred to as Experiment 3 hereafter), which also takes place during the Summer Learning program, we will randomize Summer Learning classrooms into two arms: a resident teacher support treatment and a control group. Resident teachers, who are teachers-in-training employed by CPS, will be randomly assigned to Summer Learning classrooms and will provide support to lead classroom teachers to increase student engagement and attendance at daily Summer Learning sessions; act as mentors/role models and develop positive relationships with students in their assigned class(es); and support students in using their online math learning platform, Khan Academy, more effectively.

Experiment 2 and Experiment 3 start after completion of Experiment 1 and set up a 2x2 randomization scheme where students may receive the Resident Teacher treatment only, the texting treatment only, the Resident Teacher and texting treatment, or no treatment (control).
Intervention Start Date
2020-07-06
Intervention End Date
2020-08-14
Primary Outcomes
Primary Outcomes (end points)
Our primary outcome variable for the registration experiment (Experiment 1) is registration in the Summer Learning program, which will be provided by CPS using their Aspen student information system.

Our primary outcomes for Experiment 2 and 3 include:
- Attendance during the Summer Learning program
- Engagement with Khan Academy during the Summer Learning program
Primary Outcomes (explanation)
Experiment 2 and 3:
- Attendance during the Summer Learning program: measured by number of Summer Learning program days attended
- Engagement with Khan Academy: Time “learning” as recorded by the Khan Academy platform
Secondary Outcomes
Secondary Outcomes (end points)
Our secondary outcomes for Experiments 2 and 3 include:
- Other measures of engagement in Summer Learning, including amount of time logged into Google Classroom, Summer Learning course completion, and end-of-program assessment scores, if available
- Other measures of engagement with Khan Academy, including time spent on the Khan platform, number of problems attempted on Khan, and % mastered in the corresponding ‘getting ready for grade X’ course for students for whom Khan was used as their math online platform (grades 3-8), if available
- Engagement with other online learning platforms such as ST Math and Amplify used by CPS during the summer learning program and the following school year (if available).
- Academic outcomes during the following year as measured by: standardized math and reading test scores; math and reading course grades, overall GPA and Math GPA, course completion, and engagement with Khan during the following school year (SY2021).
- Engagement with remote learning during the following school year
Secondary Outcomes (explanation)
Experiments 2 and 3:
- Engagement with ST Math and Amplify will be measured by (if available): time spent on the platform, number of logins, and progress on courses/mastery goals.
- Engagement with the Khan platform will be measured by: time spent on the Khan platform on math, number of problems attempted on Khan, % mastered in the corresponding ‘getting ready for grade X’ course (if available).
- Math GPA is measured by grades in core Mathematics courses only
- Engagement with remote learning during the following school year will be measured by: number of days attended and amount of time logged in on Google Classroom
Experimental Design
Experimental Design
Experiment 1: We will evaluate the impact of outreach workers on Summer Learning registration using a randomized controlled trial. Students who were eligible for CPS’ Summer Learning program were randomized into either the outreach worker treatment group or the control group. Each of the 20 outreach workers received a random list of 400 students and proceeded to call parents working down their lists over the course of one week.

Experiment 2: We will evaluate the impact of text message reminders on Summer Learning participation and academic outcomes. Students whose parents registered them for Summer Learning and consented to receive text messages from CPS were randomized into either the text message reminder treatment or the control group. 3,804 students were randomized into the treatment group and 3,804 were randomized into control.

Experiment 3: We will evaluate the impact of classroom support in the form of a Resident Teacher on student Summer Learning participation and academic outcomes. We randomized Resident Teacher support at the classroom level. Out of 461 Summer Learning classrooms, 120 were randomly assigned Resident Teacher support.

Experiment 2 and Experiment 3 set up a 2x2 randomization scheme where students may receive the Resident Teacher treatment only, the texting treatment only, the Resident Teacher and texting treatment, or no treatment (control).
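The cross-randomization described above can be sketched in a few lines. This is a toy illustration (the registry's randomization was done in Stata, and all names and counts here are hypothetical): each student carries an individual texting indicator and inherits a classroom-level Resident Teacher indicator, which together place the student in one of four cells.

```python
# Minimal sketch of the 2x2 design created by cross-randomizing the
# individual-level texting treatment (Experiment 2) with the classroom-level
# Resident Teacher treatment (Experiment 3). Names are illustrative.
from collections import Counter

def cell_counts(students):
    """Tabulate students by (texting, resident_teacher) treatment cell."""
    return Counter((s["text"], s["rt"]) for s in students)

# Toy sample: texting assigned at the student level, RT status inherited
# from the student's (hypothetical) classroom.
students = [{"id": i, "text": i % 2 == 0, "rt": (i // 2) % 2 == 0}
            for i in range(8)]
cells = cell_counts(students)  # four cells: (T,T), (T,F), (F,T), (F,F)
```

Tabulating the four cells like this is the first step toward estimating each treatment's effect alone and any interaction between them.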
Experimental Design Details
Not available
Randomization Method
Experiment 1: Students are randomized at the individual level using Stata. We first remove two groups of students from our sample: those who are already registered at the time of randomization and those for whom we do not have a phone number listed in our data. Next, we randomly order and select 8,000 students from this sample for the treatment group and assign all remaining students in our sample to the control group. We then check balance across treatment and control groups on the following baseline covariates: race, age, gender, English Language Learner status, students-in-temporary-living-situations status, learning disability, attendance days, membership days, free/reduced-price lunch eligibility, GPA, math and reading course failures, standardized math score, standardized reading score, and grade. We randomly sort the 8,000 treatment students into 20 individual lists for outreach workers and check balance on the same variables across the 20 lists.
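The Experiment 1 procedure (drop ineligible students, randomly select 8,000 treated students, split them into 20 caller lists) can be sketched as follows. The registry states the actual randomization was done in Stata; the field names here (`already_registered`, `phone`) are illustrative, not real CPS variables, and the balance checks are omitted.

```python
# Hypothetical sketch of the Experiment 1 individual-level randomization.
import random

def randomize_outreach(students, n_treatment=8000, n_lists=20, seed=12345):
    """Drop already-registered and no-phone students, randomly assign
    n_treatment students to the outreach arm, and split the treatment
    group into n_lists caller lists of equal size."""
    eligible = [s for s in students
                if not s["already_registered"] and s["phone"] is not None]
    rng = random.Random(seed)
    rng.shuffle(eligible)
    treatment, control = eligible[:n_treatment], eligible[n_treatment:]
    # Every 20th student (by shuffled order) goes to the same caller list.
    lists = [treatment[i::n_lists] for i in range(n_lists)]
    return treatment, control, lists

# Toy roster: 16,000 students, 10% already registered, all with a phone.
students = [{"id": i, "already_registered": i % 10 == 0, "phone": "555"}
            for i in range(16000)]
treat, ctrl, lists = randomize_outreach(students)
```

With this toy roster, the 14,400 eligible students split into 8,000 treated (20 lists of 400, matching the 400-student lists described in the design) and 6,400 control.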
Experiment 2: Students in Summer Learning were randomly assigned to the texting treatment at the individual level using Stata. We use CPS’s registration responses from the form they sent out to families to do this. We set a consistent seed, remove those students who are ineligible for Summer Learning (not in grades 1-8), remove those students without phone numbers available, and only keep those students who consented to be part of the study. We randomly assign 3,804 students (half of those who met the listed criteria) to the texting treatment.
Experiment 3: The randomization for Experiment 3 is at the classroom level. Resident Teachers are randomly assigned to Summer Learning classrooms using Stata. We randomize classrooms proportionally by grade band (grades 1-3, 4-5, and 6-8) and by the classroom's instructional language (English or Spanish); proportionally here means allocating Resident Teachers (RTs) based on how many classrooms of each type there are in each grade band across the Summer Learning program. Only Resident Teachers with Spanish-speaking abilities are randomized to classrooms listed as Spanish-speaking. We assigned each Resident Teacher to support two classrooms. Each 1st-5th grade lead teacher was assigned to only one classroom, since they would teach the same group of students both math and reading, while each 6th-8th grade lead teacher was assigned to teach one subject (either math or reading) to two classrooms, so a math teacher would teach two different groups of students. Given this structure, and to fulfill the goal of assigning each Resident Teacher to two classrooms, in 1st-5th grade we paired each Resident Teacher with two lead teachers (and therefore two classrooms), while in 6th-8th grade we paired each Resident Teacher with one math lead teacher (and therefore both of that teacher's classrooms). Since a key component of the Resident Teacher treatment was support with engaging students on the online math platform, we did not assign Resident Teacher support to 6th-8th grade reading lead teachers; Resident Teachers in those grades support math teachers only.
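The proportional, stratified classroom randomization for Experiment 3 can be sketched as below. Again, the actual assignment was done in Stata on the real CPS classroom list; the roster, field names, and counts here are hypothetical, and the Spanish-language restriction and RT-to-lead-teacher pairing are not modeled.

```python
# Illustrative sketch of Experiment 3's classroom-level randomization:
# treated classrooms are drawn proportionally within each
# (grade band, instructional language) stratum.
import random
from collections import defaultdict

def assign_resident_teachers(classrooms, n_treated=120, seed=2020):
    """Randomly select n_treated classrooms, allocating treatment slots
    to strata in proportion to each stratum's share of all classrooms."""
    strata = defaultdict(list)
    for c in classrooms:
        strata[(c["band"], c["language"])].append(c)
    rng = random.Random(seed)
    total = len(classrooms)
    treated = []
    for _, rooms in sorted(strata.items()):
        # Proportional allocation; with awkward stratum sizes, rounding
        # may require a small manual adjustment to hit n_treated exactly.
        k = round(n_treated * len(rooms) / total)
        rng.shuffle(rooms)
        treated += rooms[:k]
    return treated

# Hypothetical roster: 100 classrooms in two grade-band strata (60/40).
classrooms = [{"id": i, "band": "1-3" if i < 60 else "6-8",
               "language": "English"} for i in range(100)]
treated = assign_resident_teachers(classrooms, n_treated=20)
```

With 20 slots and a 60/40 split, the strata receive 12 and 8 treated classrooms respectively, mirroring the proportional rule described above.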
Randomization Unit
Experiment 1: Students are randomized at the individual (student) level for the registration experiment.
Experiment 2: The text message nudges are randomized at the student level.
Experiment 3: The Resident Teacher treatment is randomized at the classroom level.
Was the treatment clustered?
Yes
Experiment Characteristics
Sample size: planned number of clusters
Experiments 1 and 2 were not clustered.
Experiment 3 was clustered at the classroom level, with 461 total classrooms.
Sample size: planned number of observations
Experiment 1: 14,567 total students
Experiment 2: 7,608 total students
Experiment 3: At the time of designing the experiment and writing up the pre-analysis plan, we do not have classroom roster data and therefore do not have any information on how students were distributed across classrooms/teachers. Consequently, we do not know the total number of observations for Experiment 3.
Sample size (or number of clusters) by treatment arms
Experiment 1: 8,000 students outreach worker treatment; 6,567 control students
Experiment 2: 3,804 students receiving texts; 3,804 control students
Experiment 3: 120 treatment classrooms; 341 control classrooms.
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
EXPERIMENT 1 (registration experiment)

We designed Experiment 1 before knowing CPS’ actual capacity for the Summer Learning program. Roughly 23,000 students in the district were eligible for the program. We make a series of assumptions in order to carry out power calculations under different scenarios.

Scenario 1: We assume that not all eligible students will want to participate and that the district cannot serve all eligible students, so we set program capacity to 11,000 students served in Summer Learning. Assuming 3,300 students are already registered by the time Experiment 1 starts, and that the control group’s take-up rate is 50 percent, we calculate minimum detectable effect sizes (MDEs) for changes in registration. We vary the number of students called by an outreach worker (3,500 or 7,000) and set power = 0.8. If outreach workers called 3,500 students, we would be able to detect a 2.8 percentage point difference in registration between treatment and control students; if they called 7,000 students, we would be able to detect a 2.4 percentage point difference.
- MDE (3,500 students called, 8,100 students in control): 0.0283227
- MDE (7,000 students called, 6,350 students in control): 0.0242665

Scenario 2: We set program capacity to 23,000 students (all those eligible) served in Summer Learning and repeat the exercise, further assuming that 8,000 students are already registered when Experiment 1 starts and that the control group’s take-up rate is 15 percent. The outcome is registration for the program; we vary the number of students called (4,000, 2,000, or 1,000) and set power = 0.8. If outreach workers called 4,000 students, we would be able to detect a 2.1 percentage point difference in registration; if they called 2,000 or 1,000 students, we would be able to detect differences of 2.6 and 3.5 percentage points, respectively.
- MDE (4,000 students called, 5,500 in control): 0.0213398
- MDE (2,000 students called, 6,500 in control): 0.0262759
- MDE (1,000 students called, 7,000 students in control): 0.034877

EXPERIMENT 2 (texting experiment)

We ran power calculations for the individual-level texting treatment, assigning half of the 7,608 Summer Learning students to treatment and half to control (3,804 each). Since we did not have SY20 data on which specific students were eligible for Summer Learning when calculating the MDEs, we constructed two samples of SY18 students that we assumed would look similar to those eligible this year: students with the most absent days in SY18, and students with the most math and reading course failures in SY18. We look at these same students’ outcomes in the following school year (SY19). Absences refers to the number of days absent per school year. Math and reading course failures refers to the total number of math- or reading-related courses a student failed in the fall quarter. Test scores are from the fall-quarter math and reading assessments and are standardized at the grade and subject level, by year.

To interpret: using the mock sample with the most SY18 absent days, assuming 7,608 students register for Summer Learning (3,804 treatment, 3,804 control) and power = 0.8, we are able to detect a change of 1.48 days in absences; 0.03 in courses failed; 0.06 standard deviations in standardized math score; and 0.07 standard deviations in standardized reading score. Using the mock sample with the most SY18 course failures, under the same assumptions, we are able to detect a change of 1.06 days in absences; 0.03 in courses failed; 0.06 standard deviations in math; and 0.06 standard deviations in reading. We also test whether we can detect a change in the summer program failure rate: assuming a control-group failure rate of 50%, we are able to detect a 3-percentage-point change.

MDEs (7,608 students; 3,804 treatment, 3,804 control):
- SY19 absences, most-SY18-absences sample: 1.47966
- SY19 course failures, most-SY18-absences sample: 0.0303899
- SY19 standardized math score, most-SY18-absences sample: 0.0625444
- SY19 standardized reading score, most-SY18-absences sample: 0.0655973
- SY19 absences, most-SY18-course-failures sample: 1.055417
- SY19 course failures, most-SY18-course-failures sample: 0.0349519
- SY19 standardized math score, most-SY18-course-failures sample: 0.0613209
- SY19 standardized reading score, most-SY18-course-failures sample: 0.0616284
- Summer program failure rate (control rate 50 percent): 0.0320979

EXPERIMENT 3 (Resident Teacher experiment)

We assume 120 treatment clusters and 341 control clusters, with 8,000 students in the program (the total number in the program, not only students who consented to receive texts). At the time of designing the experiment and writing the pre-analysis plan, we did not have classroom roster data and therefore had no information on how students were distributed across classrooms; we assume the same number of students per classroom throughout. We calculate the intra-cluster correlation coefficient based on the SY18 year-school-grade for each SY19 outcome. As in Experiment 2, since we did not have SY20 eligibility data, we constructed two mock samples of SY18 students (those with the most absent days, and those with the most math and reading course failures) and look at their SY19 outcomes, with the same outcome definitions as above.

To interpret: using the mock sample with the most SY18 absent days, assuming 8,000 students register for Summer Learning (120 treatment classrooms, 341 control) and power = 0.8, we are able to detect a change of 2.62 days in absences; 0.08 in courses failed; 0.12 standard deviations in standardized math score; and 0.13 standard deviations in standardized reading score. Using the mock sample with the most SY18 course failures, under the same assumptions, we are able to detect a change of 2.04 days in absences; 0.09 in courses failed; 0.16 standard deviations in math; and 0.14 standard deviations in reading. Assuming a control-group summer program failure rate of 50%, we are able to detect an 11-percentage-point change.

MDEs (8,000 students; 120 treatment classrooms, 341 control):
- SY19 absences, most-SY18-absences sample: 2.617067
- SY19 course failures, most-SY18-absences sample: 0.0758736
- SY19 standardized math score, most-SY18-absences sample: 0.1239422
- SY19 standardized reading score, most-SY18-absences sample: 0.1253976
- SY19 absences, most-SY18-course-failures sample: 2.04071
- SY19 course failures, most-SY18-course-failures sample: 0.0859007
- SY19 standardized math score, most-SY18-course-failures sample: 0.1587865
- SY19 standardized reading score, most-SY18-course-failures sample: 0.143222
- Summer program failure rate (control rate 50 percent): 0.1067743
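The binary-outcome MDEs above (for registration and program failure rates) can be approximated with the standard normal formula for a difference in proportions. This is a sketch using z = 1.96 (two-sided alpha = 0.05) and z = 0.84 (power = 0.8), not the exact Stata command behind the registered numbers, though it reproduces them to within rounding.

```python
# Rough replication of the binary-outcome MDE calculations:
# MDE = (z_alpha + z_power) * standard error of the difference in
# proportions, with variance taken at the control-group rate.
from math import sqrt

def mde_binary(p_control, n_treat, n_control, z_alpha=1.96, z_power=0.84):
    """Minimum detectable effect for a difference in proportions."""
    var = p_control * (1 - p_control)
    se = sqrt(var / n_treat + var / n_control)
    return (z_alpha + z_power) * se

# Experiment 1, scenario 1: 50% control take-up.
mde_3500 = mde_binary(0.50, 3500, 8100)  # close to the registered 0.0283227
mde_7000 = mde_binary(0.50, 7000, 6350)  # close to the registered 0.0242665
```

For the clustered Experiment 3 outcomes, the same standard error would be inflated by the design effect 1 + (m - 1) * ICC, where m is students per classroom, which is why the classroom-level MDEs are several times larger than the individual-level ones.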
IRB
INSTITUTIONAL REVIEW BOARDS (IRBs)
IRB Name
The University of Chicago Social and Behavioral Sciences Institutional Review Board
IRB Approval Date
2020-08-21
IRB Approval Number
IRB20-1124
Analysis Plan

There are documents in this trial unavailable to the public.