The effect of survey design and mode on completion rates and data quality for remote surveys in low-income settings.
Last registered on June 12, 2020

Pre-Trial

Trial Information
General Information
Title
The effect of survey design and mode on completion rates and data quality for remote surveys in low-income settings.
RCT ID
AEARCTR-0006001
Initial registration date
June 12, 2020
Last updated
June 12, 2020 12:55 PM EDT
Location(s)
Region
Primary Investigator
Affiliation
Busara Center for Behavioral Economics
Other Primary Investigator(s)
Additional Trial Information
Status
Ongoing
Start date
2020-05-28
End date
2020-07-15
Secondary IDs
Abstract
Remote surveys in low-income settings, particularly in East Africa, are common, yet they follow a range of practices without clear best-practice standards and have to date been predominantly the purview of market researchers, with limited application in academic research. Differences in survey response rates and data quality across modes of remote data collection and survey design features are of key importance as remote research becomes more popular in light of COVID-19, both for academics and for organizations. This study tests for differences in response rates, data quality, and substantive responses across two of the most common modes of remote data collection - phone and SMS surveys - and across core features of survey design: question order, pre-communication, and completion incentive. The study is conducted using a survey on COVID-19 in Kenya amongst a low-income urban population; the substantive content covers the economic, psychological, and physical effects of COVID-19 and a set of basic economic preference outcomes. The study aims to provide robust evidence to validate methods for researchers and guidelines for optimal survey design.
External Link(s)
Registration Citation
Citation
Owsley, Nicholas Calbraith. 2020. "The effect of survey design and mode on completion rates and data quality for remote surveys in low-income settings." AEA RCT Registry. June 12. https://doi.org/10.1257/rct.6001-1.0.
Experimental Details
Interventions
Intervention(s)
"Interventions" in this case comprise the range of survey design features that are randomly varied across administered survey invitations and surveys. Each participant is assigned to one variation of each of the four treatment sets below, according to blocked random assignment.

1. Mode
Participants are assigned to complete the survey using the following modes and platforms, with a Phone:SMS assignment ratio of 1:3. Each list of participants is also randomly assigned across days and times so as to create the maximum possible overlap in implementation times across modes.

a. Phone survey
Survey conducted over phone by an enumerator using SurveyCTO as a data capture tool.

b. SMS
Survey conducted over SMS through the Telerivet SMS poll platform.


2. Incentives
Participants are assigned to receive one of the below incentive amounts, conditional on completion of the 4th module in the survey (approximately 60% of questions), in the ratio of 6:4:3 for Ksh 300 : Ksh 150 : Ksh 50. In each of the below treatments, the amount is communicated in the same way across modes in the initial script during consenting.

a. Micro-incentive/reimbursement: Ksh 50 incentive

b. Reasonable incentive for length of survey: Ksh 150

c. Higher than average incentive for length of survey: Ksh 300


3. Survey Pre-communication
Each participant is randomly assigned to receive the following pre-communication regarding the survey in equal proportions. Please see here for a more detailed protocol.

a. No SMS
Respondents do not receive an SMS. The survey itself is the first communication they receive.

b. Plain SMS warning about the survey.
Respondents receive an SMS on the morning of or morning prior to the survey to indicate that the respondent will be invited to participate in a survey. The script is as follows:


4. Order Effects
The survey includes 2 modules which vary in position across the survey's 7 modules, each with 1) a Likert question regarding COVID sentiment, 2) a continuous-response preference measure, and 3) a data quality measure. These modules are numbered 2 and 6. Each will be randomly assigned to appear either early or late in the survey, as shown below:

a. Order 1
Module 2 → 6: The order of the survey will be -
Module 1
Module 2
Module 3, 4, 5
Module 6
Module 7

b. Order 2
Module 6 → 2: The order of the survey will be -
Module 1
Module 6
Module 3, 4, 5
Module 2
Module 7
Intervention Start Date
2020-05-28
Intervention End Date
2020-06-17
Primary Outcomes
Primary Outcomes (end points)
1. Logistics and meta-survey
Consent rates (respondents indicating ‘yes’ in consent).
Completion rates (respondents completing all questions).

2. Response quality:
Instructional manipulation check
Response consistency between the same question in two parts of the survey
Recall within the survey of detail of previous content
Self report of low attention
‘Skip’ patterns (selecting the same option repeatedly; all m3_3_a-d and all m4_1_a-d == 1)

Final response quality variables will depend in part on outcome variation, and two approaches will be taken: i) outcomes showing a lack of variation will be dropped; ii) outcomes that can plausibly be transformed into an index meeting internal consistency requirements will be combined into such an index.
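One common internal-consistency criterion for such an index is Cronbach's alpha. The registration does not name the metric, so alpha is an assumption here; the sketch below is an illustration, not the study's analysis code.

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a set of items.

    `items` is a list of equal-length lists, one list per item (question),
    each containing one response per participant.
    """
    k = len(items)
    n = len(items[0])

    def var(xs):
        # Sample variance (n - 1 denominator).
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    sum_item_vars = sum(var(item) for item in items)
    # Per-participant total score across items.
    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum_item_vars / var(totals))
```

A threshold such as alpha ≥ 0.7 is a conventional (not registry-specified) cutoff for combining items into an index.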

3. Response distributions:
Altruism (1 question; continuous distribution)
Risk (1 question; continuous distribution)
COVID-19 perceptions (2 questions; likert)
PHQ4 index (4 questions, combined; likert index)
GBV index (3 questions, combined; likert index)
Economic impact index (4 questions, combined; likert index)
Primary Outcomes (explanation)
Secondary Outcomes
Secondary Outcomes (end points)
Drop-off rates (response indicating ‘yes’ in consent but not completing the survey)
[Phone only] - rates of answering first call attempts.
[SMS only] - rates of completing up to *m4_2*
Secondary Outcomes (explanation)
Experimental Design
Experimental Design
Each participant is invited to complete a survey on COVID-19 via their assigned remote survey mode and according to the randomly assigned features described in the treatment conditions below. The survey is administered in Kenya amongst a low-income urban population, and its substantive content covers the economic, psychological, and physical effects of COVID-19 and a set of basic economic preference outcomes. This trial is primarily concerned with meta-outcomes of the survey rather than substantive outcomes.

The questions and script for the survey are the same across treatment conditions, but small features of the survey design are varied to identify treatment effects. The following sets of treatments are randomly assigned in a cross-cutting/blocked manner such that each survey invite will include a specific Mode (a or b, described in detail below), Incentive (a, b, or c), Survey pre-communication (a or b), and Order (a or b) according to assignment. Randomization will be blocked by treatment to maximize power for identifying primary effects and for interaction effects. In some cases, treatment blocks are deliberately unequal to account for budgetary constraints and low projected completion rates, but in such cases the sample is expanded to compensate for losses in power.
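The cross-cutting assignment with unequal ratios can be sketched as follows. This is a hypothetical Python illustration only (the study's randomization was implemented in Stata), and the function and arm labels are assumptions; each treatment set is assigned independently by shuffling a vector that encodes the allocation ratio exactly.

```python
import random

# Treatment sets and allocation weights from the registration:
# Mode 1:3 (Phone:SMS), Incentive 6:4:3 (Ksh 300:150:50),
# Pre-communication 1:1, Order 1:1. Labels are illustrative.
TREATMENT_SETS = {
    "mode": [("phone", 1), ("sms", 3)],
    "incentive": [("ksh_300", 6), ("ksh_150", 4), ("ksh_50", 3)],
    "pre_communication": [("no_sms", 1), ("plain_sms", 1)],
    "order": [("order_1", 1), ("order_2", 1)],
}

def assign(n_participants, seed=0):
    """Return one dict per participant mapping each treatment set to an arm."""
    rng = random.Random(seed)
    assignments = [dict() for _ in range(n_participants)]
    for name, arms in TREATMENT_SETS.items():
        total_weight = sum(w for _, w in arms)
        # Build a label vector matching the target ratio as exactly as
        # n_participants allows, then shuffle it.
        vector = []
        for label, w in arms:
            vector += [label] * (n_participants * w // total_weight)
        while len(vector) < n_participants:  # pad any rounding remainder
            vector.append(rng.choice([label for label, _ in arms]))
        rng.shuffle(vector)
        for person, label in zip(assignments, vector):
            person[name] = label
    return assignments
```

Because each set is shuffled independently, realized marginal arm counts are exact, while cross-set cell counts are balanced only in expectation; a fully blocked design would assign within factorial cells.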

Outcomes evaluated include meta-survey outcomes, such as consent and survey completion (analysed across treatment sets 1-3 outlined below), as well as data quality and substantive outcomes as they vary according to survey design features (treatment sets 1 and 4 below).

Treatment sets are as follows.

1. Mode
Participants are assigned to complete the survey using the following modes and platforms, with a Phone:SMS assignment ratio of 1:3. Each list of participants is also randomly assigned across days and times so as to create the maximum possible overlap in implementation times across modes.

a. Phone survey
Survey conducted over phone by an enumerator using SurveyCTO as a data capture tool.

b. SMS
Survey conducted over SMS through the Telerivet SMS poll platform.


2. Incentives
Participants are assigned to receive one of the below incentive amounts, conditional on completion of the 4th module in the survey (approximately 60% of questions), in the ratio of 6:4:3 for Ksh 300 : Ksh 150 : Ksh 50. In each of the below treatments, the amount is communicated in the same way across modes in the initial script during consenting.

a. Micro-incentive/reimbursement: Ksh 50 incentive

b. Reasonable incentive for length of survey: Ksh 150

c. Higher than average incentive for length of survey: Ksh 300


3. Survey Pre-communication
Each participant is randomly assigned to receive the following pre-communication regarding the survey in equal proportions. Please see here for a more detailed protocol.

a. No SMS
Respondents do not receive an SMS. The survey itself is the first communication they receive.

b. Plain SMS warning about the survey.
Respondents receive an SMS on the morning of or morning prior to the survey to indicate that the respondent will be invited to participate in a survey. The script is as follows:


4. Order Effects
The survey includes 2 modules which vary in position across the survey's 7 modules, each with 1) a Likert question regarding COVID sentiment, 2) a continuous-response preference measure, and 3) a data quality measure. These modules are numbered 2 and 6. Each will be randomly assigned to appear either early or late in the survey, as shown below:

a. Order 1
Module 2 → 6: The order of the survey will be -
Module 1
Module 2
Module 3, 4, 5
Module 6
Module 7

b. Order 2
Module 6 → 2: The order of the survey will be -
Module 1
Module 6
Module 3, 4, 5
Module 2
Module 7
Experimental Design Details
Randomization Method
Stata 14, block randomization across each treatment set.
Randomization Unit
Individual.
Was the treatment clustered?
No
Experiment Characteristics
Sample size: planned number of clusters
NA
Sample size: planned number of observations
2912
Sample size (or number of clusters) by treatment arms
SMS: 2184
Phone: 728

Incentive 1: 1344
Incentive 2: 896
Incentive 3: 672

Pre Communication 1: 1456
Pre Communication 2: 1456

Order 1: 1456
Order 2: 1456
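The arm sizes above follow directly from the planned N of 2,912 and the allocation ratios stated in the treatment descriptions. A small arithmetic check (illustrative Python, not part of the registration):

```python
def split(total, ratio):
    """Divide `total` observations across arms in the given integer ratio."""
    denom = sum(ratio)
    return [total * part // denom for part in ratio]

n = 2912
mode = split(n, [1, 3])          # Phone : SMS
incentive = split(n, [6, 4, 3])  # Ksh 300 : Ksh 150 : Ksh 50
pre_comm = split(n, [1, 1])      # No SMS : Plain SMS
order = split(n, [1, 1])         # Order 1 : Order 2
```

With N = 2,912, every ratio divides evenly, so the planned arm sizes are exact.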
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
IRB
INSTITUTIONAL REVIEW BOARDS (IRBs)
IRB Name
AMREF
IRB Approval Date
2020-05-01
IRB Approval Number
P797/2020
Analysis Plan

There are documents in this trial unavailable to the public. Access to this information may be requested through the registry.
Post-Trial
Post Trial Information
Study Withdrawal
Intervention
Is the intervention completed?
No
Is data collection complete?
Data Publication
Data Publication
Is public data available?
No
Program Files
Program Files
Reports, Papers & Other Materials
Relevant Paper(s)
REPORTS & OTHER MATERIALS