Optimizing Protocols for Random Digit Dial Surveys
Last registered on January 04, 2021

Pre-Trial

Trial Information
General Information
Title
Optimizing Protocols for Random Digit Dial Surveys
RCT ID
AEARCTR-0006882
Initial registration date
December 29, 2020
Last updated
January 04, 2021 9:10 AM EST
Location(s)
Nine regions (names not shown)
Primary Investigator
Affiliation
Innovations for Poverty Action
Other Primary Investigator(s)
PI Affiliation
Northwestern University
Additional Trial Information
Status
Ongoing
Start date
2020-05-05
End date
2021-09-30
Secondary IDs
Abstract
In empirical development economics, it is customary to collect data via face-to-face household surveys. However, the COVID-19 pandemic and other factors have encouraged a shift to phone surveys for this work, yet the research literature on mobile phone surveys in low- and middle-income countries is thin. Much of that literature is also out of date, given rapid changes in mobile phone penetration even among poor households as digital financial tools are used more and more for social protection interventions.

In this study, IPA takes advantage of phone surveys launched in nine countries to embed experiments on the best times and days to initiate phone surveys and the optimal number of attempts for high response rates and data quality. We use random digit dial surveys, which make it straightforward to randomly assign cases (initial attempts) to time of day and day of week. By construction, most other variables in call protocols are also randomized, because cases (phone numbers to attempt) are dialed in random order.
External Link(s)
Registration Citation
Citation
Dillon, Andrew and Steven Glazerman. 2021. "Optimizing Protocols for Random Digit Dial Surveys." AEA RCT Registry. January 04. https://doi.org/10.1257/rct.6882-1.0.
Experimental Details
Interventions
Intervention(s)
Vary time of day, day of week, and maximum number of attempts

Time of day is recorded in precise units with a timestamp, but for the analysis we group calls into morning (before 12 PM), mid-day (12 PM to 3:59 PM), and evening (4 PM to 10:59 PM); see the sketch below.
For day of week, each day is extracted from the timestamp data, but Saturday and Sunday are combined into a single weekend indicator.
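A minimal sketch of this grouping, assuming a hypothetical column name (attempt_timestamp) and a 3:59 PM upper bound for mid-day; it is not taken from the study's own code.

```python
# Illustrative only: bins call-attempt timestamps into the time-of-day and
# day-of-week categories described above. Column names are hypothetical.
import pandas as pd

def add_call_window_vars(calls: pd.DataFrame, ts_col: str = "attempt_timestamp") -> pd.DataFrame:
    """Add time-of-day and day-of-week grouping variables to a call-attempt table."""
    out = calls.copy()
    ts = pd.to_datetime(out[ts_col])

    # Morning: before 12 PM; mid-day: 12:00-3:59 PM; evening: 4:00-10:59 PM.
    out["time_of_day"] = pd.cut(
        ts.dt.hour,
        bins=[0, 12, 16, 23],               # left-closed bins: [0,12), [12,16), [16,23)
        labels=["morning", "mid-day", "evening"],
        right=False,
    )

    # Day of week, with Saturday and Sunday collapsed into a single weekend indicator.
    dow = ts.dt.day_name()
    out["weekend"] = dow.isin(["Saturday", "Sunday"]).astype(int)
    out["day_of_week"] = dow.where(out["weekend"] == 0, "weekend")
    return out
```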
Intervention Start Date
2020-05-05
Intervention End Date
2021-06-30
Primary Outcomes
Primary Outcomes (end points)
Pickup: did the respondent answer the phone?
Survey completion: did the respondent complete the interview?
Sample composition (demographic characteristics of respondents)
Primary Outcomes (explanation)
Demographic composition is conditional on having completed the survey.
Age is continuous (years)
Education is collapsed into an indicator for secondary education completed or not.
Gender is self-reported.
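A minimal sketch of this outcome coding, assuming hypothetical column names (completed, age, education_level, gender) and a hypothetical education coding; the study's actual variable construction may differ.

```python
# Illustrative only: constructs the demographic outcome variables described above.
# Column names and the education coding are assumptions, not the study's code.
import pandas as pd

def prepare_demographics(respondents: pd.DataFrame) -> pd.DataFrame:
    """Keep completed interviews and code the age, education, and gender outcomes."""
    out = respondents[respondents["completed"] == 1].copy()                 # conditional on survey completion
    out["age_years"] = pd.to_numeric(out["age"], errors="coerce")           # age kept continuous (years)
    out["secondary_complete"] = (out["education_level"] >= 3).astype(int)   # hypothetical coding: 3 = completed secondary
    out["female"] = (out["gender"].str.lower() == "female").astype(int)     # self-reported gender
    return out
```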
Secondary Outcomes
Secondary Outcomes (end points)
Secondary Outcomes (explanation)
Experimental Design
Experimental Design
Units of analysis are individual cases. A case is a valid phone number (registered SIM card) that could result in a completed phone interview.
Interviewers are assigned cases at random from a list provided by the mobile network operator or by a third-party sample provider such as Sample Solutions, which pre-pulses the numbers to verify that they are working, active numbers.
The survey dialer randomly assigns cases over a work week (5, 6, or 7 days), and paradata are recorded for each call attempt, including time of day, day of week, interviewer ID, and attempt number. We will analyze only the first attempt for the time-of-day and day-of-week analysis, because all actions taken on subsequent attempts are influenced by call center staff behavior and depend on respondent behavior (e.g., reaching a second attempt requires someone to ignore the call or refuse cooperation on the first attempt).
Analysis is a straightforward comparison of pickup and completion rates by day and time, as sketched below.
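A minimal sketch of that comparison, restricted to first attempts, assuming hypothetical column names (attempt_number, picked_up, completed) and the grouping variables from the intervention sketch above.

```python
# Illustrative only: mean pickup and completion rates by call window,
# restricted to first attempts as described above. Column names are hypothetical.
import pandas as pd

def rates_by_window(calls: pd.DataFrame) -> pd.DataFrame:
    """Compare pickup and completion rates by day of week and time of day, first attempts only."""
    first = calls[calls["attempt_number"] == 1]
    return (
        first.groupby(["day_of_week", "time_of_day"], observed=True)[["picked_up", "completed"]]
        .mean()
        .rename(columns={"picked_up": "pickup_rate", "completed": "completion_rate"})
        .reset_index()
    )
```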
Experimental Design Details
Not available
Randomization Method
Randomization is done with software, by the dialer program within SurveyCTO.
Randomization Unit
Units are "cases", where each case is a pre-pulsed phone number.
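The registration states only that the SurveyCTO dialer performs the randomization with software; the sketch below is not that dialer, just an illustration of ordering pre-pulsed cases at random, with a hypothetical seed for reproducibility.

```python
# Illustrative only: dialing pre-pulsed cases (phone numbers) in random order,
# so time of day and day of week are effectively randomly assigned at first attempt.
import random

def randomize_case_order(case_ids: list[str], seed: int = 12345) -> list[str]:
    """Return a reproducible random dialing order for a list of pre-pulsed cases."""
    rng = random.Random(seed)          # hypothetical seed, for reproducibility only
    order = list(case_ids)
    rng.shuffle(order)
    return order
```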
Was the treatment clustered?
No
Experiment Characteristics
Sample size: planned number of clusters
62,663 cases
Sample size: planned number of observations
62,663 cases. Conditional analyses will have smaller sample sizes: analysis of completions conditional on pickup, N = 27,945; analysis of respondent demographics, based on respondents who completed the survey, N = 7,953.
Sample size (or number of clusters) by treatment arms
Time of day: Morning 16,230; Mid-day 27,885; Evening 18,548
Day of week: ranges from 8,200 for the weekend to 10,970 on Mondays, with around 10,000 on each other weekday
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
IRB
INSTITUTIONAL REVIEW BOARDS (IRBs)
IRB Name
IPA IRB
IRB Approval Date
2020-05-05
IRB Approval Number
N/A. The board chair approved blanket language to be incorporated into several separate IRB submissions to cover methods experiments that imposed no new burden on respondents.