Targeting Nudges to Students in the Pandemic: An Adaptive Experiment in Brazil

Last registered on February 17, 2021

Pre-Trial

Trial Information

General Information

Title
Targeting Nudges to Students in the Pandemic: An Adaptive Experiment in Brazil
RCT ID
AEARCTR-0007152
Initial registration date
February 16, 2021

First published
February 17, 2021, 10:36 AM EST

Locations

Region

Primary Investigator

Affiliation
University of Zurich

Other Primary Investigator(s)

PI Affiliation
Movva
PI Affiliation
University of Zurich
PI Affiliation
University of Zurich

Additional Trial Information

Status
In development
Start date
2021-02-17
End date
2021-07-31
Secondary IDs
Abstract
A growing literature documents that nudges to students and their caregivers – from reminders to encouragement messages – can systematically improve educational outcomes. However, when there is substantial heterogeneity in treatment effects, policy-makers have little guidance on which nudge is best for each student. In partnership with the São Paulo State Secretariat of Education (SEDUC-SP), in Brazil, this project implements an adaptive treatment protocol to assess heterogeneity in responses to motivational nudges via text messages, and thereby to assign optimal messaging to students based on their observed characteristics. In a previous experiment (Bettinger et al., 2021), run during the pandemic school closures, students were randomized to motivational nudges via text messages that targeted (i) salience of school activities, (ii) growth mindset, (iii) beliefs about high returns to effort, (iv) beliefs about low costs of effort, (v) risk-taking, or (vi) future-orientation. We built a machine learning model to predict student-level responses to each treatment arm – short-term effects on access to remote learning activities, and end-of-year effects on attendance and grades – based on students' individual characteristics. In this follow-up experiment, we use the ML-predicted ranking of treatments to assign each individual to their personally optimal messages. We evaluate the impacts of being assigned the treatment that is best on average versus the personally optimal treatment on school attendance, grades, and school dropout.
External Link(s)

Registration Citation

Citation
Ash, Elliott et al. 2021. "Targeting Nudges to Students in the Pandemic: An Adaptive Experiment in Brazil." AEA RCT Registry. February 17. https://doi.org/10.1257/rct.7152-1.0
Sponsors & Partners

There is information in this trial unavailable to the public.
Experimental Details

Interventions

Intervention(s)
Bettinger et al. (2021) evaluates the impacts of several versions of the growth mindset intervention (Yeager, 2019; Bettinger et al., 2018) delivered via text messages to students and their caregivers. The experiment decomposes the original intervention into its underlying economic parameters, in order to single out the key drivers of its impacts on educational outcomes (if any). Concretely, the interventions are as follows:

(i) Growth mindset: text messages that convey the content of the original intervention, communicating to students that their brain is ‘like a muscle’ that can ‘become stronger’ with effort, that everyone can improve relative to themselves, and that success (or failure) is not merely a matter of talent (or lack thereof).

(ii) Salience of school activities: text messages with simple reminders from the school; a placebo intervention whose messages make school activities more salient without affecting beliefs, risk preferences, or time preferences.

(iii) High returns to effort: text messages that emphasize that higher effort leads to better educational outcomes.

(iv) Low costs of effort: text messages that emphasize that studying is not that hard and might even be fun.

(v) Risk-taking: text messages that emphasize the value of taking risks.

(vi) Future-orientation: text messages that emphasize the value of thinking about one’s future.

Over the course of the 3rd quarter of 2020, those interventions were piloted at scale: 800,000 students received a single text message whose content was randomized across the 6 groups above. The project continues into 2021, evaluating the impacts of the interventions delivered through two SMS per week over three months of the first semester.

In 2021, Movva, our implementing partner, produced 6 different sequences of motivational nudges via text messages, to be sent over the course of the 1st and 2nd school quarters to 240,000 students. Each sequence is inspired by one of the interventions in Bettinger et al. (2021), although the sequences differ from that study in other dimensions (such as the activities they suggest) that could not be varied in the context of the controlled experiment.

In this follow-up experiment, we use heterogeneous treatment effects estimated from the interventions piloted in 2020 to assign the nudges inspired by those interventions based on their predicted returns. On the one hand, we have the average treatment effects of each of the 6 treatment arms of the 2020 pilot. On the other hand, we form predicted conditional average treatment effects of these arms for any given individual, based on their observed pre-treatment characteristics.

Because short-term treatment effects on access to online remote learning activities differ markedly from end-of-year effects on attendance and grades (both for average and conditional average treatment effects), we record those estimates separately and experiment with optimal targeting based on different outcomes as well.
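The two targeting rules described above can be sketched as follows. This is an illustrative sketch only: the arrays and names are hypothetical, not the study's actual estimation code, and the predicted CATEs here are random placeholders standing in for the ML model's output.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical predicted conditional average treatment effects (CATEs):
# one row per student, one column per treatment arm (6 arms, as in the
# 2020 pilot). In the study these would come from an ML model fit on
# pilot data using pre-treatment characteristics.
n_students, n_arms = 1000, 6
predicted_cate = rng.normal(size=(n_students, n_arms))

# Standard targeting: every student receives the arm with the highest
# average treatment effect in the (pilot) sample.
best_on_average = int(predicted_cate.mean(axis=0).argmax())
standard_assignment = np.full(n_students, best_on_average)

# Individual targeting: each student receives the arm with the highest
# predicted effect given their own characteristics.
individual_assignment = predicted_cate.argmax(axis=1)
```

The same two rules can be computed separately for each outcome (online access versus end-of-year attendance and grades), yielding the four targeted groups in the design below plus the random-assignment benchmark.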
Intervention Start Date
2021-02-17
Intervention End Date
2021-06-30

Primary Outcomes

Primary Outcomes (end points)
- Weekly access to the distance learning platform;
- Weekly time online on the distance learning platform;
- Weekly in-person attendance (by school subject);
- Quarterly grades (by school subject);
- Student dropouts.
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
The basic design of this new experiment comprises 5 groups:

1) A random assignment group, in which students are randomly assigned to 1 out of the 6 sequences inspired by the interventions in the pilot experiment;

2) A standard targeting based on online access group, in which students are assigned to the sequence inspired by the intervention with the highest average treatment effect in the pilot, based on treatment effects on access to online activities;

3) A standard targeting based on attendance and grades group, in which students are assigned to the sequence inspired by the intervention with the highest average treatment effect in the pilot, based on treatment effects on 4th-quarter attendance and grades;

4) An individual targeting based on online access group, in which students are assigned to the sequence inspired by the intervention with the highest conditional average treatment effect, learned from the pilot results and predicted using the new cohort’s personal characteristics, based on treatment effects on access to online activities; and

5) An individual targeting based on attendance and grades group, in which students are assigned to the sequence inspired by the intervention with the highest conditional average treatment effect, learned from the pilot results and predicted using the new cohort’s personal characteristics, based on treatment effects on 4th-quarter attendance and grades.
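Because randomization is at the classroom level (see below), a minimal sketch of the clustered assignment into these five groups looks like the following; the classroom IDs and group labels are illustrative, not the study's actual implementation:

```python
import random

random.seed(42)  # reproducible assignment, as in computer-based randomization

# Illustrative labels for the five design groups.
groups = [
    "random_assignment",
    "standard_targeting_online",
    "standard_targeting_attendance_grades",
    "individual_targeting_online",
    "individual_targeting_attendance_grades",
]

# Toy classroom identifiers; randomization is at the classroom level,
# so every student in a classroom shares the same condition.
classrooms = [f"class_{i:04d}" for i in range(20)]
random.shuffle(classrooms)

# Cycle the shuffled classrooms through the five groups for a balanced split.
assignment = {c: groups[i % len(groups)] for i, c in enumerate(classrooms)}
```

Assigning whole classrooms rather than individual students is what makes the treatment clustered, and is why power calculations and standard errors must account for within-classroom correlation.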
Experimental Design Details
Randomization Method
Randomization done in office by a computer
Randomization Unit
Classroom level
Was the treatment clustered?
Yes

Experiment Characteristics

Sample size: planned number of clusters
7,877 classrooms
Sample size: planned number of observations
240,858 students
Sample size (or number of clusters) by treatment arms
1) Random assignment group: 1,183 classrooms (47,561 students)
2) Standard targeting based on online access: 1,187 classrooms (48,761 students)
3) Individual targeting based on online access: 1,203 classrooms (46,464 students)
4) Standard targeting based on attendance and grades: 1,182 classrooms (49,505 students)
5) Individual targeting based on attendance and grades: 1,190 classrooms (48,567 students)
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
IRB

Institutional Review Boards (IRBs)

IRB Name
University of Zurich
IRB Approval Date
2020-09-07
IRB Approval Number
OEC IRB # 2020-057
Analysis Plan

Analysis Plan Documents

PAP_Feb2021.pdf

MD5: 327564239a3a0b34c0f3fa83fc24057d

SHA1: 33a7f1c3a3cfb087f7a41d889b83c6586518aa62

Uploaded At: February 16, 2021

Post-Trial

Post Trial Information

Study Withdrawal

There is information in this trial unavailable to the public.

Intervention

Is the intervention completed?
No
Data Collection Complete
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials