Understanding the effects of text reminders on reducing churn in supplemental assistance programs

Last registered on January 19, 2022

Pre-Trial

Trial Information

General Information

Title
Understanding the effects of text reminders on reducing churn in supplemental assistance programs
RCT ID
AEARCTR-0005227
Initial registration date
January 03, 2020

Initial registration date is when the trial was registered.

It corresponds to when the registration was submitted to the Registry to be reviewed for publication.

First published
January 03, 2020, 3:43 PM EST

First published corresponds to when the trial was first made public on the Registry after being reviewed.

Last updated
January 19, 2022, 9:08 AM EST

Last updated is the most recent time when changes to the trial's registration were published.

Locations

Primary Investigator

Affiliation
Annenberg Institute at Brown University

Other Primary Investigator(s)

PI Affiliation
Annenberg Institute at Brown University
PI Affiliation
Annenberg Institute at Brown University

Additional Trial Information

Status
In development
Start date
2022-02-03
End date
2023-02-01
Secondary IDs
Prior work
This trial does not extend or rely on any prior RCTs.
Abstract
"Churning” is when eligible transitional benefit program participants fail to renew, temporarily dropping out of their program before reentering within four months. Churn is a widespread issue across transitional benefit programs, negatively impacting participants and administering agencies. This proposed intervention will assess whether a low-cost scalable texting program can reduce churn for the Supplemental Nutrition Assistance Program. Participants in one treatment group will receive text messages that simplify the interim report and re-certification process with reminders, information, and direct links. A second treatment arm will receive reminders only, while a control group will experience the current procedures of the Massachusetts Department of Transitional Assistance. The text messages aim to reduce informational and behavioral barriers program participants face and thus lessen churn. We estimate an overall sample size around 60,000 over the course of three months. We will iterate three times during the project, testing timing of the texts and identifying heterogeneous effects. The primary outcome of interest is churn probability. We will also assess completing process steps on time rates, submitting complete forms or verification rates, interview completion rates (including as-scheduled and post-missed interview), approval and closure rates, and the average number of days that forms are submitted before deadlines.
External Link(s)

Registration Citation

Citation
Loeb, Susanna, Samuel Madison and Katharine Meyer. 2022. "Understanding the effects of text reminders on reducing churn in supplemental assistance programs." AEA RCT Registry. January 19. https://doi.org/10.1257/rct.5227-1.1
Sponsors & Partners

There is information in this trial that is unavailable to the public.
Experimental Details

Interventions

Intervention(s)
“Churning” is when eligible transitional benefit program participants fail to renew, temporarily dropping out of their program before reentering within four months. Churn is a widespread issue across transitional benefit programs, negatively impacting participants and administering agencies. This proposed intervention will assess whether a low-cost, scalable texting program can reduce churn in the Supplemental Nutrition Assistance Program. Participants in one treatment group will receive text messages that simplify the interim report and re-certification process with reminders, information, and direct links. A second treatment arm will receive reminders only, while a control group will experience the current procedures of the Massachusetts Department of Transitional Assistance. The text messages aim to reduce the informational and behavioral barriers program participants face and thus lessen churn. We estimate an overall sample size of around 60,000 over the course of three months. We will iterate three times during the project, testing the timing of the texts and identifying heterogeneous effects. The primary outcome of interest is churn probability. We will also assess rates of completing process steps on time, rates of submitting complete forms or verification, interview completion rates (both as scheduled and after a missed interview), approval and closure rates, and the average number of days before the deadline that forms are submitted.
Intervention Start Date
2022-02-17
Intervention End Date
2023-02-01

Primary Outcomes

Primary Outcomes (end points)
The primary outcome of interest is churn probability. We will also assess rates of completing process steps on time, rates of submitting complete forms or verification, interview completion rates (both as scheduled and after a missed interview), approval and closure rates, and the average number of days before the deadline that forms are submitted.
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
For this intervention, we will randomly assign SNAP benefit participants who are due for their interim report or re-certification and are thus at risk of churning into groups. The first treatment group will receive text message reminders about key steps, a phone number that they can call to receive help, and hyperlinks to the website and mobile application where clients can submit their forms. This treatment addresses informational constraints such as lack of information on how and where to file the paperwork. It also addresses the behavioral barriers created by cognitive load by including easy links and holds client attention through reminders. A second treatment arm will receive reminders only. All text messages will include a note that recipients can “Text STOP to end messages.” The control group consists of participants receiving the standard letters and automated phone calls currently in effect.

This research design answers these key questions:
(1) Can a text-messaging program reduce churn?
(2) Do reminders alone drive the effects, or is a reduction in cognitive load through easy-to-implement steps necessary to reduce churn?

The comparison of the two treatments provides evidence on whether reminders alone are beneficial or whether easing the processes and reducing the cognitive load are necessary. We will run the intervention for three months and then iterate on the texts, aiming to identify particularly promising practices. The large sample and the ability to iterate will allow us to explore whether the time of day or day of the week for texts affects their impact, as was found for parent texting; what the optimal number of text messages is; and whether certain groups respond differentially, including groups by age, location, children, work history, and race/ethnicity. Texting programs are unusually easy to implement with fidelity, and we are using available data.

All SNAP recipients who are due for their interim report or re-certification and who have provided their phone number to DTA will be considered for this intervention. We have only limited information on opt-out rates, but believe that 20 percent is a conservative estimate, given opt-out rates from other studies. While DTA’s caseload fluctuates, a conservative estimate is that approximately 10,000 re-certification forms and 15,000 Interim Reports (IRs) will be due every month for SIMP-12 SNAP clients. With an opt-out rate of 20 percent, we will have a monthly sample size of around 8,000 household heads (10,000 re-certifications due x 80% = 8,000) who will need to complete re-certification forms, and of 12,000 household heads (15,000 IRs due x 80% = 12,000) who will need to complete an IR.

Given the nature of this project, we intend to treat the maximum number of eligible participants. We intend to first field this intervention for three months (February 2020 through May 2020) and estimate an overall sample size of around 51,000 clients whom DTA has permission to communicate with via text message. Using Optimal Design, based on a single-level person-randomized trial, for the calculations, we expect to be able to find acceptable minimum detectable effect sizes (MDEs) with this sample. We calculate the MDE of a binary treatment comparison (C vs. T1, C vs. T2, or T1 vs. T2) for a power of 80 percent, a significance level of 5 percent, and sample sizes corresponding to opt-out rates between 60 percent and 20 percent. We anticipate these opt-out rates are much higher than we will encounter with the intervention, since clients have already had the opportunity to opt out of text message communication. In the worst expected case, with an opt-out rate of 60 percent (51,000 x 0.4 = 20,400), comparing two treatment groups (20,400 x 2/3 = 13,600) and an R2 of three percent, we will be able to detect effects of 0.047 standard deviations (SD) or larger for any binary treatment comparison. In more likely scenarios, we will be able to detect effect sizes as low as 0.028 SDs. Given a historic average churn of 22 percent (SD = 0.414), these effect sizes translate to detectable MDEs of between 1.2 and 1.9 percentage points. Previous related studies find larger or similar effect sizes.

We will estimate the average treatment effect in a straightforward linear model:
Y_i = β_0 + β_1 T_i^1 + β_2 T_i^2 + X_i β_3 + ε_i

where Y_i is the outcome of interest for household head i (submission of re-certification form or interim report, SNAP benefit renewal), and T_i^1 and T_i^2 are binary indicators for the treatment groups. We will include participant background characteristics, X_i, to improve precision, but they should not affect point estimates.
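
For illustration, a minimal sketch of this estimation in Python follows; the data file and column names (churned, treat1, treat2, and the covariates) are hypothetical placeholders, not DTA's actual variable names.

    # Minimal sketch of the average-treatment-effect regression described above.
    # File and column names are hypothetical placeholders.
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("snap_analysis_sample.csv")

    # Y_i = b0 + b1*T1_i + b2*T2_i + X_i*b3 + e_i
    model = smf.ols("churned ~ treat1 + treat2 + age + has_children + work_history", data=df)
    results = model.fit(cov_type="HC1")  # robust SEs for a binary outcome in a linear model
    print(results.summary())
    # The coefficients on treat1 and treat2 estimate each arm's effect relative to control.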
Experimental Design Details
Randomization Method
We will randomly assign SNAP benefit participants who are due for their interim report or re-certification, and are thus at risk of churning, into three groups, using statistical software on secure computers in the Annenberg Institute at Brown University office:

(T1) The first treatment group will receive text message reminders about key steps, a phone number that they can call to receive help, and hyperlinks to the website and mobile application where clients can submit their forms.

(T2) A second treatment arm will receive text reminders only.

(C) A control group consisting of participants receiving the standard letters and automated phone calls currently in effect.

All SNAP recipients who are due for their interim report or re-certification and who have provided their phone number to DTA will be considered for this intervention.
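
As an illustration of this assignment step, a minimal sketch in Python is below; the input file, column names, and the even one-third split are assumptions for illustration rather than DTA's actual data or procedure.

    # Minimal sketch of individual-level random assignment to the three arms.
    # File and column names are hypothetical placeholders.
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(seed=20220203)  # fixed seed so the assignment is reproducible

    clients = pd.read_csv("eligible_clients.csv")  # household heads due for an IR or re-certification
    clients["arm"] = rng.choice(["T1", "T2", "C"], size=len(clients), p=[1/3, 1/3, 1/3])

    clients.to_csv("assigned_clients.csv", index=False)
    print(clients["arm"].value_counts())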

Randomization Unit
We will randomly assign SNAP benefit participants who are due for their interim report or re-certification, and are thus at risk of churning, at the individual level.
Was the treatment clustered?
No

Experiment Characteristics

Sample size: planned number of clusters
Given the nature of this project, we intend to treat the maximum number of eligible participants. We intend to first field this intervention for three months (February 2020 through May 2020) and estimate an overall sample size of around 51,000 clients whom DTA has permission to communicate with via text message.
Sample size: planned number of observations
Our planned number of observations is the same as our planned number of clusters because the design is not clustered. Given this, we again intend to treat the maximum number of eligible participants. We intend to first field this intervention for three months (February 2020 through May 2020) and estimate an overall sample size of around 51,000 clients whom DTA has permission to communicate with via text message.
Sample size (or number of clusters) by treatment arms
We aim to evenly split our randomized sample across Treatment arm 1, Treatment arm 2, and the Control arm, and estimate approximately 17,000 participants each in the control (C), reminder (T2), and reminder+action (T1) conditions.
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
Using Optimal Design, based on a single-level person-randomized trial, for the calculations, we expect to be able to find acceptable minimum detectable effect sizes (MDEs) with this sample. Our power calculation shows the MDE of a binary treatment comparison (C vs. T1, C vs. T2, or T1 vs. T2) for a power of 80 percent, a significance level of five percent, and sample sizes corresponding to opt-out rates between 60 percent and 20 percent. We anticipate these opt-out rates are much higher than we will encounter with the intervention, since clients have already had the opportunity to opt out of text message communication. The two lines on the graph, attached in the pre-analysis plan, represent R2s explained by covariates of three percent (as found in previous analyses using a limited set of covariates) and 15 percent. We present a range of R2s because we will have a full set of covariates available in our estimation, which will likely result in an R2 higher than three percent. In the worst expected case, with an opt-out rate of 60 percent (51,000 x 0.4 = 20,400), comparing two treatment groups (20,400 x 2/3 = 13,600) and an R2 of three percent, we will be able to detect effects of 0.047 standard deviations (SD) or larger for any binary treatment comparison. In more likely scenarios, we will be able to detect effect sizes as low as 0.028 SDs. Given a historic average churn of 22 percent (SD = 0.414), these effect sizes translate to detectable MDEs of between 1.2 and 1.9 percentage points.
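
As a rough cross-check of these figures, the short Python sketch below reproduces the worst-case MDE using a standard two-group approximation, MDE ≈ (z_(1-α/2) + z_power) x sqrt((1 − R2) / (P(1 − P) x n)), with an even split between the two compared arms; this is our own back-of-the-envelope calculation, not Optimal Design output.

    # Back-of-the-envelope reproduction of the worst-case MDE described above.
    from scipy.stats import norm

    alpha, power = 0.05, 0.80
    n = 13_600          # two compared arms after a 60% opt-out (20,400 * 2/3)
    p_treat = 0.5       # even split between the two compared arms
    r_squared = 0.03    # variance explained by covariates
    sd_churn = 0.414    # SD of the binary churn outcome (mean of 22%)

    m = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    mde_sd = m * ((1 - r_squared) / (p_treat * (1 - p_treat) * n)) ** 0.5
    print(f"MDE: {mde_sd:.3f} SD = {mde_sd * sd_churn * 100:.1f} percentage points")
    # Prints roughly 0.047 SD, i.e. close to 2 percentage points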
IRB

Institutional Review Boards (IRBs)

IRB Name
Brown University IRB, Human Research Protection Program
IRB Approval Date
2019-12-10
IRB Approval Number
The Brown IRB formally determined that this project does not meet the regulatory definition of human subjects research; therefore, there is no specific IRB approval number.
Analysis Plan

There is information in this trial that is unavailable to the public.

Post-Trial

Post Trial Information

Study Withdrawal

There is information in this trial that is unavailable to the public.

Intervention

Is the intervention completed?
No
Data Collection Complete
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials