Quantifying and Addressing Coverage Bias in Phone Surveys

Last registered on April 27, 2021

Pre-Trial

Trial Information

General Information

Title
Quantifying and Addressing Coverage Bias in Phone Surveys
RCT ID
AEARCTR-0007608
Initial registration date
April 26, 2021

First published
April 27, 2021, 7:11 AM EDT

Locations

Region

Primary Investigator

Affiliation
Innovations for Poverty Action

Other Primary Investigator(s)

PI Affiliation
Innovations for Poverty Action

Additional Trial Information

Status
Completed
Start date
2021-01-28
End date
2021-03-31
Secondary IDs
Abstract
With pandemic conditions making face-to-face data collection too risky or costly for most countries, global poverty researchers must rely heavily on mobile phone surveys. The main weakness of phone surveys is the difficulty of achieving a representative sample (Gibson et al. 2017). Coverage bias will typically be correlated with outcomes of interest, introducing bias into most analyses of the survey data if not accounted for. This study takes advantage of an opportunity to test methods for addressing this sort of coverage bias in phone surveys, in which researchers are conducting a phone survey of a large sample of enrollees in a digital cash transfer program managed by a government agency. The researchers have baseline data for each potential respondent, including location, household demographics, and economic outcomes of interest from the original cash transfer program. Potential respondents are randomly placed into either high-intensity or low-intensity contact protocols, similar to the approach recommended in DiNardo et al. (2021), generating random variation in the probability that a respondent will end up in the sample and allowing researchers to assess the marginal value of each follow-up call, given that the respondent did not answer on previous attempts. Using the economic features in the baseline data as outcomes, we can then assess the effectiveness of various weighting schemes designed to adjust for coverage bias.
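As a concrete illustration of the last step, the sketch below shows one candidate weighting scheme: inverse-probability weights from a logistic model of survey completion on baseline demographics, evaluated by checking whether the weighted respondent mean of a baseline economic outcome recovers the full-sample mean. The column names and the choice of model are assumptions for illustration only; the registration does not commit to a specific scheme.

```python
# Hypothetical inverse-probability-weighting check (illustrative only).
# Assumes a pandas DataFrame `baseline` with one row per sampled phone number
# and columns 'completed' (1 if a survey was completed), demographic predictors
# ('age', 'hh_size', 'urban'), and a baseline economic outcome ('income').
import numpy as np
from sklearn.linear_model import LogisticRegression

X = baseline[["age", "hh_size", "urban"]]
y = baseline["completed"]

# Estimated probability that each number ends up in the phone sample.
p_respond = LogisticRegression(max_iter=1000).fit(X, y).predict_proba(X)[:, 1]

respondents = baseline[y == 1].copy()
respondents["weight"] = 1.0 / p_respond[np.asarray(y == 1)]

truth = baseline["income"].mean()            # known for everyone from baseline data
unweighted = respondents["income"].mean()
weighted = np.average(respondents["income"], weights=respondents["weight"])
print(f"full sample: {truth:.2f}  unweighted: {unweighted:.2f}  weighted: {weighted:.2f}")
```

Because the baseline outcome is observed for the full sample, the gap between the weighted respondent mean and the full-sample mean gives a direct measure of how much coverage bias a given weighting scheme removes.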
External Link(s)

Registration Citation

Citation
Collins, Elliott and Steve Glazerman. 2021. "Quantifying and Addressing Coverage Bias in Phone Surveys." AEA RCT Registry. April 27. https://doi.org/10.1257/rct.7608-1.0
Experimental Details

Interventions

Intervention(s)
Prior to starting phone calls, the experiment separates the call list of cash transfer recipient phone numbers (N = 5,000) evenly into three groups:
- Control Group numbers receive one call in each of three time blocks over a single day, after which they are replaced.
- SMS Group numbers follow the control protocol, but with an SMS message sent in advance of the second call explaining who is calling and why.
- High Intensity Group numbers follow the same protocol as the SMS Group, but over two days before being replaced, rather than one.

Another 1,500 phone numbers were selected as replacement numbers.

Treatment status was stratified by Region, Financial Service Provider (FSP) used by the recipient, and the date of the initial cash transfer, and replacement is conducted entirely within strata.
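For concreteness, here is a minimal sketch of the stratified assignment described above, assuming the call list is a pandas DataFrame with columns region, fsp, and transfer_date; these names are illustrative and not the study's actual schema or code.

```python
# Illustrative stratified assignment of the call list into the three protocols.
# Assumes a DataFrame `call_list` with one row per phone number and the
# stratification variables as columns (names are assumptions).
import numpy as np
import pandas as pd

rng = np.random.default_rng(seed=2021)
ARMS = ["control", "sms", "high_intensity"]

def assign_within_stratum(stratum: pd.DataFrame) -> pd.DataFrame:
    """Shuffle one stratum and split it as evenly as possible across the arms."""
    order = rng.permutation(len(stratum))
    out = stratum.iloc[order].copy()
    arm_cycle = list(rng.permutation(ARMS))   # rotate which arm absorbs the remainder
    out["arm"] = [arm_cycle[i % 3] for i in range(len(out))]
    return out

assigned = (
    call_list
    .groupby(["region", "fsp", "transfer_date"], group_keys=False)
    .apply(assign_within_stratum)
)
print(assigned["arm"].value_counts())
```

Splitting each shuffled stratum in rotation keeps the arms within one unit of each other inside every stratum, which is how an overall split of roughly 1,666 or 1,667 numbers per arm arises.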
Intervention Start Date
2021-01-28
Intervention End Date
2021-03-31

Primary Outcomes

Primary Outcomes (end points)
1. Survey Consents and Completions
2. Average age, gender, location, and economic profile of respondents
Primary Outcomes (explanation)
The "Economic Profile" of respondents will be characterized using the most substantive economic features which are available in the baseline data or possible to construct ex post using location and demographic profiles. It remains to be seen which of these variables will be most informative (e.g. income was collected in the baseline dataset, but possibly with significant measurement error), but may include any transaction data, income status, and the community- or neighborhood-level poverty rate of the respondent's registered address.

Secondary Outcomes

Secondary Outcomes (end points)
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
Prior to starting phone calls, the experiment separates the call list of cash transfer recipient phone numbers (N = 5,000) evenly across the three treatment groups described under Interventions. Another 1,500 phone numbers are selected as replacement numbers. Treatment status was stratified by Region, Financial Service Provider (FSP) used by the recipient, and the date of the initial cash transfer, and replacement is conducted entirely within strata.

Researchers will first characterize the differences in demographic and economic profile between the full sample (based on baseline data) and the phone sample (those who ultimately answered the phone and completed a survey). The analysis will then estimate the marginal response rate for each additional call made to unresponsive numbers, with response rates calculated according to AAPOR guidelines, providing an estimate of the value of recruitment intensity for increasing sample size in phone surveys. The demographic differences observed among those contacted on the first, second, and third attempts will be used to characterize the value of recruitment intensity for addressing coverage bias.
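A hedged sketch of the marginal-response-rate calculation follows. It assumes a DataFrame with one row per phone number, the number of call attempts made ('n_attempts'), and the attempt on which an interview was completed ('completed_on', missing if never completed); these names are illustrative, and the study's actual AAPOR rate definitions distinguish eligibility categories that this simplification ignores.

```python
# Marginal response rate: among numbers that received a k-th call and had not
# been reached before attempt k, what share completed the survey on attempt k?
# `calls` and its column names are assumptions for illustration.
import numpy as np
import pandas as pd

max_attempts = int(calls["n_attempts"].max())
rows = []
for k in range(1, max_attempts + 1):
    not_yet_reached = calls["completed_on"].isna() | (calls["completed_on"] >= k)
    received_kth_call = calls["n_attempts"] >= k
    at_risk = calls[not_yet_reached & received_kth_call]
    completed_k = int((at_risk["completed_on"] == k).sum())
    rows.append({
        "attempt": k,
        "numbers_called": len(at_risk),
        "marginal_response_rate": completed_k / len(at_risk) if len(at_risk) else np.nan,
    })

marginal_rates = pd.DataFrame(rows)
print(marginal_rates)
```

Tabulated separately by treatment arm, these rates indicate the marginal value of the second and third calls and, in the high-intensity arm, of the additional calling day.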
Experimental Design Details
Randomization Method
Randomization was done using a random number generator by the same team that managed data collection and cleaning.
Randomization Unit
Randomization was conducted at the individual level within strata.
Was the treatment clustered?
No

Experiment Characteristics

Sample size: planned number of clusters
5,000 phone survey respondents.
Sample size: planned number of observations
5,000 phone survey respondents.
Sample size (or number of clusters) by treatment arms
Each treatment arm contains 1,666 or 1,667 phone numbers.
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
IRB

Institutional Review Boards (IRBs)

IRB Name
Innovations for Poverty Action Institutional Review Board
IRB Approval Date
2021-01-05
IRB Approval Number
N/A

Post-Trial

Post Trial Information

Study Withdrawal

Intervention

Is the intervention completed?
No
Data Collection Complete
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials