Bettinger et al. (2021) evaluate the impacts of several versions of the growth mindset intervention (Yeager, 2019; Bettinger et al., 2018), delivered via text messages to students and their caregivers. The experiment decomposes the original intervention into its underlying economic parameters in order to single out the key drivers of its impacts on educational outcomes (if any). Concretely, the interventions are as follows:
(i) Growth mindset: text messages that convey the content of the original intervention, communicating to students that their brain is ‘like a muscle’ and, as such, can ‘become stronger’ with greater effort; that everyone can improve relative to themselves; and that success (failure) is not merely a matter of talent (or the lack thereof).
(ii) Salience of school activities: text messages with simple reminders from the school; a placebo intervention whereby text messages make school activities more salient without affecting beliefs, risk preferences, or time preferences.
(iii) High returns to effort: text messages that emphasize that higher effort leads to better educational outcomes.
(iv) Low costs of effort: text messages that emphasize that studying is not that hard and might even be fun.
(v) Risk-taking: text messages that emphasize the value of taking risks.
(vi) Future-orientation: text messages that emphasize the value of thinking about one’s future.
Over the course of the 3rd quarter of 2020, these interventions were piloted at scale: 800,000 students received a single text message whose content was randomized across the 6 groups above. The project continues into 2021, evaluating the impacts of the interventions delivered through two SMS per week over three months during the first semester.
In 2021, Movva, our implementing partner, produced 6 different sequences of motivational nudges via text messages to be sent over the course of the 1st and 2nd school quarters to 240,000 students. Each sequence is inspired by one of the interventions in Bettinger et al. (2021), although the sequences differ from that study in that they vary along other dimensions (such as the activities they suggest) that could not be changed in the context of a controlled experiment.
In this follow-up experiment, we estimate heterogeneous treatment effects of the interventions piloted in 2020 in order to assign the nudges inspired by those interventions based on their predicted returns. On the one hand, we have the average treatment effects of each of the 6 treatment arms of the 2020 pilot. On the other hand, we form predicted conditional average treatment effects of each arm for any given individual, based on their observed pre-treatment characteristics.
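The assignment rule described above can be sketched with a simple T-learner: fit one outcome model per arm on pilot data, predict each individual's conditional average treatment effect relative to a reference arm, and assign the arm with the highest predicted effect. The simulated data, OLS outcome models, and choice of reference arm below are illustrative assumptions, not the study's actual estimator.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical pilot data: pre-treatment covariates X, assigned arm
# w in {0, ..., 5} (arm 0 taken as the reference/placebo), outcome y.
n, k, n_arms = 6000, 3, 6
X = rng.normal(size=(n, k))
w = rng.integers(0, n_arms, size=n)
# Simulated outcome whose arm effects vary with the first covariate.
arm_slope = np.array([0.0, 0.3, -0.2, 0.5, 0.1, -0.4])
y = X @ np.array([1.0, 0.5, -0.5]) + arm_slope[w] * X[:, 0] + rng.normal(size=n)

def fit_ols(Xa, ya):
    """OLS with intercept; returns the coefficient vector."""
    Xa1 = np.column_stack([np.ones(len(Xa)), Xa])
    beta, *_ = np.linalg.lstsq(Xa1, ya, rcond=None)
    return beta

# T-learner: fit one outcome model per treatment arm.
betas = [fit_ols(X[w == a], y[w == a]) for a in range(n_arms)]

def predicted_cates(Xnew):
    """Predicted CATE of each arm relative to arm 0 (reference)."""
    X1 = np.column_stack([np.ones(len(Xnew)), Xnew])
    preds = np.stack([X1 @ b for b in betas], axis=1)  # shape (n, n_arms)
    return preds - preds[:, [0]]

# Assign each student the arm with the highest predicted CATE.
cates = predicted_cates(X)
assignment = cates.argmax(axis=1)
```

In this simulated example, students with a high value of the first covariate are steered toward the arm whose effect increases most in that covariate, which is the essence of targeting on predicted returns rather than on the average treatment effect alone.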
Because short-term treatment effects on access to online remote learning activities differ markedly from end-of-year effects on attendance and grades (for both average and conditional average treatment effects), we record those estimates separately and experiment with optimal targeting based on different outcomes as well.