
Messaging to Improve Phone Survey Response Rates

Last registered on December 10, 2020


Trial Information

General Information

Initial registration date
July 08, 2020


First published
July 13, 2020, 3:51 PM EDT


Last updated
December 10, 2020, 11:50 AM EST




Primary Investigator

Innovations for Poverty Action

Other Primary Investigator(s)

PI Affiliation
Northwestern University
PI Affiliation
Northwestern University
PI Affiliation
Innovations for Poverty Action
PI Affiliation
Innovations for Poverty Action

Additional Trial Information

Ongoing
Start date
End date
Secondary IDs
A substantial literature on survey response rates focuses on framing and on appeals to altruism as motivations for participating, using methods such as pre-survey postcards and letters to encourage cooperation, with evidence coming primarily from the U.S. and Europe. More recently, there has been growing interest in mobile phone surveys in low- and middle-income countries, where the efficacy of methods for improving response rates is less well established. This study randomizes the use of pre-survey text messages: whether to send them and which type of appeal to make. It also randomizes the messaging used in the consent script, appealing alternately to "researcher" or "government" as the motivating authority in the first round of experiments, and appealing to the efficacy of participation and to self-interest with reminders about monetary compensation. The experiment is conducted in 11 random-digit dial (RDD) surveys in 10 countries, with follow-up surveys in 5 of those surveys.
External Link(s)

Registration Citation

Dillon, Andrew et al. 2020. "Messaging to Improve Phone Survey Response Rates." AEA RCT Registry. December 10.
Experimental Details


The experiment varied two factors.

Factor 1 is the SMS text message sent to the respondent prior to the CATI interview call, with 3 possible levels:
S0 = no SMS
SG = SMS, appeal to "government"
SR = SMS, appeal to "researcher"

Factor 2 is appeal in the consent script, with 3 levels:
G = consent appeals to "government"
R = consent appeals to "researcher"
P = consent appeals to "policymaker"

In Colombia and Mexico, the design was a 2x2 (no cases assigned to S0 or P).
In the other Wave 1 countries it was a 3x1 (S, G, or P).
"Mixed message" cells (SG-R, SG-P, SR-G, SR-P, etc.) are not populated.
In Spanish-speaking countries, "policymaker" appeals are omitted because the terminology is hard to distinguish from "government."
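As an illustrative sketch (not the study's actual code), the populated-cell rule described above can be enumerated in a few lines of Python, using the level codes from the factor lists; the function name is ours:

```python
from itertools import product

# Wave 1 factor levels, as listed in the registration.
SMS_LEVELS = ["S0", "SG", "SR"]    # no SMS / "government" appeal / "researcher" appeal
CONSENT_LEVELS = ["G", "R", "P"]   # government / researcher / policymaker

def populated_cells():
    """Enumerate factorial cells, dropping the unpopulated 'mixed message'
    combinations where an SMS appeal is sent but disagrees with the
    consent-script appeal (e.g. SG-R or SR-P)."""
    cells = []
    for sms, consent in product(SMS_LEVELS, CONSENT_LEVELS):
        if sms != "S0" and sms[1] != consent:
            continue  # mixed-message cell: skip
        cells.append((sms, consent))
    return cells

# S0 pairs with every consent level; SG only with G; SR only with R.
print(populated_cells())
```

Under this rule, five cells are populated in the full design; country-specific designs (the 2x2 in Colombia and Mexico, the 3x1 elsewhere) use subsets of these cells.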

In Wave 2 surveys, the treatment factor structure is altered and different messaging contrasts are planned. The pre-survey SMS message has the following appeal type for each treatment arm:
A1 Placebo/short message: no specific appeal
A2 General Learning (Food access): says that the first wave of the survey was informative about food access
A3 General Learning (Household finances): says that the first wave of the survey related to household finances
A4 Specific Learning (Food access): same as A2, but shares a statistic about food access
A5 Specific Learning (Household finances): same as A3, but shares a statistic about household finances
B1 Self Interest: reminds respondents about the monetary incentive
B2 General Learning (Food access) + Self Interest = A2 + monetary incentive message
B3 General Learning (Household finances) + Self Interest = A3 + monetary incentive message
B4 Specific Learning (Food access) + Self Interest = A4 + monetary incentive message
B5 Specific Learning (Household finances) + Self Interest = A5 + monetary incentive message
Intervention Start Date
Intervention End Date

Primary Outcomes

Primary Outcomes (end points)
Survey consents and completions
Primary Outcomes (explanation)
Consent and survey completion are binary outcomes and straightforward to code.

Secondary Outcomes

Secondary Outcomes (end points)
Contact rates (for SMS arms)
Respondent attention
Response distributions for key survey items: mask-wearing behavior and hand-washing behavior
Secondary Outcomes (explanation)
Respondent attention is measured by interviewer-coded items that are not read aloud, asking to rate respondent attention at two points during the survey, on a four-point scale: "Very attentive", "Somewhat attentive", "Somewhat distracted", and "Very distracted". Interviewer training materials spell out the details to help interviewers define and operationalize these concepts. In the analysis stage these responses are collapsed into a binary attentive/distracted measure.
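A minimal sketch of the analysis-stage recode described above, assuming the two "attentive" labels map to the attentive side of the binary measure (the function name is ours, not from the study's code):

```python
# The four interviewer-coded labels, as listed in the outcome description.
ATTENTIVE_LABELS = {"Very attentive", "Somewhat attentive"}
DISTRACTED_LABELS = {"Somewhat distracted", "Very distracted"}

def is_attentive(rating: str) -> bool:
    """Collapse the four-point attention rating to binary attentive/distracted."""
    if rating in ATTENTIVE_LABELS:
        return True
    if rating in DISTRACTED_LABELS:
        return False
    raise ValueError(f"unknown attention rating: {rating!r}")
```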

Experimental Design

Experimental Design
Randomization is built into the SurveyCTO case management system. Respondents are randomized into treatment arms and are sent either an SMS with the assigned messaging or no SMS; if they are then contacted, they are read a consent script with one of the randomly assigned appeals. The study uses paradata collected as part of the survey as well as survey responses.
Experimental Design Details
Randomization Method
Randomization programmed in SurveyCTO.
Randomization Unit
Individual cases. Each case is a phone number for an individual.
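Illustratively, individual-level assignment of cases to arms could be sketched as below. The actual randomization was programmed inside SurveyCTO's case management system; the arm labels, seed, and function here are hypothetical:

```python
import random

# Hypothetical arm codes for the populated Wave 1 cells (SMS-consent pairs).
WAVE1_ARMS = ["S0-G", "S0-R", "S0-P", "SG-G", "SR-R"]

def assign_arms(case_ids, arms, seed=2020):
    """Independently assign each case (a phone number) to one treatment arm.
    A fixed seed makes the assignment reproducible."""
    rng = random.Random(seed)
    return {case_id: rng.choice(arms) for case_id in case_ids}

# Roughly the planned scale: ~50,000 RDD cases across countries/surveys.
assignments = assign_arms([f"case_{i:05d}" for i in range(50_000)], WAVE1_ARMS)
```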
Was the treatment clustered?

Experiment Characteristics

Sample size: planned number of clusters
The survey is RDD, so the number of working numbers cannot be known ahead of time with certainty. We estimate that we will have about 50,000 cases across countries/surveys (countries are strata, not clusters).
Sample size: planned number of observations
50,000 cases
Sample size (or number of clusters) by treatment arms
Varies by country/survey, but approximately 12,000 cases in each treatment arm for the Wave 1 experiment.
The Wave 2 experiment will be conducted with followups from the RDD survey. This sample will be substantially smaller, approximately 6,000 total spread over 10 treatment arms, with unequal cell sizes ranging from 8.3% (~500 cases) to 16.7% (~1,000 cases).
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
Binary outcome (response rate): approximately 2.7 percentage points with Dunn-Bonferroni adjustment, assuming 4 tests for Wave 1 experiment. For the Wave 2 experiment, we estimate the minimum detectable effect size for the contrast with the smallest sample will be 0.104, assuming a binary outcome with control group mean of 0.50 and using a Dunn-Bonferroni correction for multiple hypothesis tests.
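The stated MDEs can be approximated with a standard two-sample proportions power formula under a Bonferroni adjustment. The sketch below uses only the assumptions named in the registration (control mean 0.50, 80% power is our added assumption, Bonferroni correction); the exact inputs behind the registered figures are not fully specified, so treat this as a back-of-the-envelope check rather than a reproduction:

```python
from statistics import NormalDist

def mde_two_proportions(p, n1, n2, alpha=0.05, n_tests=1, power=0.80):
    """Minimum detectable effect for a two-sided two-sample comparison of
    proportions, with a Bonferroni-adjusted significance level."""
    z_alpha = NormalDist().inv_cdf(1 - (alpha / n_tests) / 2)
    z_beta = NormalDist().inv_cdf(power)
    se = (p * (1 - p) / n1 + p * (1 - p) / n2) ** 0.5
    return (z_alpha + z_beta) * se

# Smallest Wave 2 contrast, illustratively ~500 cases per cell and 4 tests;
# this yields an MDE in the neighborhood of the registered 0.104.
print(round(mde_two_proportions(0.50, 500, 500, n_tests=4), 3))
```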

Institutional Review Boards (IRBs)

IRB Name
IRB Approval Date
IRB Approval Number
N/A. The board chair approved blanket language to be incorporated into several separate IRB submissions, covering methods experiments that imposed no new burden on respondents.
Analysis Plan

There is information in this trial that is unavailable to the public.


Post Trial Information

Study Withdrawal

There is information in this trial that is unavailable to the public.


Is the intervention completed?
Intervention Completion Date
June 30, 2020, 12:00 +00:00
Data Collection Complete
Data Collection Completion Date
June 30, 2020, 12:00 +00:00
Final Sample Size: Number of Clusters (Unit of Randomization)
6,826 (no clustering)
Was attrition correlated with treatment status?
Final Sample Size: Total Number of Observations
Final Sample Size (or Number of Clusters) by Treatment Arms
SMS Treatment Arm                                          Freq.   Percent      Cum.
SMS (Information)                                          1,056     15.48     15.48
SMS (Information/Incentive)                                1,050     15.39     30.88
SMS (Learning/Incentive General)                             592      8.68     39.55
SMS (Learning General - Household Finances)                  583      8.55     48.10
SMS (Learning General - Food Security)                       591      8.66     56.77
SMS (Learning Specific - Household Finances)                 593      8.69     65.46
SMS (Learning Specific - Food Security)                      589      8.64     74.09
SMS (Learning/Incentive General - Household Finances)        585      8.58     82.67
SMS (Learning/Incentive Specific - Household Finances)       591      8.66     91.34
SMS (Learning/Incentive Specific - Food Security)            591      8.66    100.00
Total                                                      6,821    100.00
Data Publication

Data Publication

Is public data available?

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials