Experimental Design Details
Our empirical strategy rests on randomly assigning respondents to answer identical questions both in person and over the phone, varying the order in which the two interview modes are administered.
As part of a separately funded methodological experiment on the mixed-mode measurement of agricultural labor inputs, the team distributed phones and SIM cards to 995 households that are part of the Nigeria General Household Survey (GHS) Panel (located in 106 randomly selected agricultural enumeration areas). The GHS-Panel is a nationally representative household survey fielded every 2-3 years by the Living Standards Measurement Study unit of the World Bank in collaboration with Nigeria's National Bureau of Statistics (NBS). The survey includes two in-person visits, the first conducted in August/September 2023 and the second in February/March 2024. The proposed experiment will be fielded concurrently with the second in-person visit: phone interviews are scheduled around the time of the in-person interview, and a set of identical questions is administered in both modes so that answers can be compared across them.
We are interested in mode effects proper, that is, differences in measured outcomes due to differences in the survey mode. To isolate the effect of survey mode, our empirical strategy addresses a number of related but distinct biases and potential confounders.
First, sampling-related confounders. If the samples of households interviewed in the phone and in-person surveys differ, we could be picking up a sampling effect rather than a mode effect. Specifically, phone survey samples are often subject to under-coverage and selective non-response, which can lead to phone and in-person samples differing systematically. By distributing phones and SIM cards to all 995 households, we ensure that every sampled household can be reached in both modes, which allows us to overcome these potential differences.
Second, if the respondents of the in-person and phone surveys differ, we could be picking up a respondent selection effect rather than a true survey mode effect. Our survey design controls for respondent selection effects within the household by targeting the same respondent in both survey modes. Respondent selection for the in-person survey is dictated by the specific data and logistical requirements of the GHS-Panel. To increase the odds that the phone respondent is the same person who answered the corresponding questions in the in-person interview, we will select two household members aged 18 or older to answer the phone survey in separate interviews.
Third, if the phone and in-person interviews were done a long time apart, differences in measured outcomes could be due to real changes in outcomes over time, rather than mode effects. We therefore plan to conduct the phone and in-person interviews at most a week apart.
Fourth, there are possible confounders related to the order in which the phone and in-person surveys are fielded, such as anchoring effects. Anchoring effects arise when respondents adjust their answers in the second scheduled interview to be consistent with the answers they gave in the first interview (irrespective of whether the phone or the in-person interview comes first). We therefore randomize the order of the phone and in-person interviews to rule out such anchoring effects. Specifically, we split the sample of 937 households into (i) a group interviewed first over the phone and then in person (Treatment Group 1) and (ii) a group interviewed first in person and later over the phone (Treatment Group 2). We are thus able to restrict comparisons to first-time respondents who have no previous interview to anchor their answers to.
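The random split into the two order groups can be sketched as follows. This is a simplified illustration only: the function name and the unstratified draw are our assumptions, and the actual assignment protocol may stratify (e.g., by enumeration area).

```python
import random


def assign_interview_order(household_ids, seed=2024):
    """Randomly split households into Treatment Group 1 (phone first)
    and Treatment Group 2 (in-person first).

    Illustrative sketch: ignores any stratification the actual
    protocol may use; the seed value is arbitrary.
    """
    rng = random.Random(seed)
    ids = list(household_ids)
    rng.shuffle(ids)
    half = len(ids) // 2
    return {"phone_first": ids[:half], "in_person_first": ids[half:]}


# Hypothetical household identifiers 0..936 standing in for the 937 households.
groups = assign_interview_order(range(937))
```

Fixing the seed makes the assignment reproducible, which is useful for documenting the randomization in a pre-analysis plan.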
Respondents in Treatment Group 1 will first be interviewed over the phone and then shortly after will be interviewed again in person, answering the same set of questions. Respondents in Treatment Group 2 will first be interviewed in person and then shortly after over the phone, answering the same set of questions. The remaining GHS-Panel households, which did not receive phones, will be interviewed in person alongside the in-person interviews of the treatment groups.
Comparing the phone interview of Treatment Group 1 with the in-person interview of Treatment Group 2 provides a clean estimate of the survey mode effect: both are first interviews, so neither can be contaminated by anchoring, and both draw on the same randomly assigned sample.
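This main comparison can be written as a simple estimating equation; the notation below is our own sketch, not taken from the survey documentation:

```latex
\[
Y_i = \alpha + \beta \, \mathrm{Phone}_i + \varepsilon_i ,
\]
```

estimated on first interviews only (phone interviews of Treatment Group 1 and in-person interviews of Treatment Group 2), where $Y_i$ is respondent $i$'s answer, $\mathrm{Phone}_i$ indicates a phone interview, and $\beta$ is the mode effect. Because the interview order is randomly assigned, $\mathrm{Phone}_i$ is independent of $\varepsilon_i$ in this restricted sample.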
Beyond our main specification, the experimental design allows testing for other effects that often affect phone and mixed-mode surveys in practice.
• Phone interview (T1) vs. in-person interview (T1) and Phone interview (T2) vs. in-person interview (T2): Each respondent is interviewed once over the phone and once in-person (in random order), allowing us to compare answers given under different survey modes for the same respondent. This within-person design controls for respondent selection effects implicitly. However, it may be susceptible to anchoring effects.
• Phone interview (T1) vs. phone interview (T2): This comparison isolates any anchoring effects, as only respondents in T2 will have been previously interviewed.
• Phone interview (T1) vs. in-person interview (T2 + remaining GHS sample): Uses the part of the GHS sample not included in the phone distribution as an additional control group to increase statistical power. However, this comparison may be confounded by the fact that the additional control households did not receive a phone.
• In-person interview (T2) vs. in-person interview (remaining GHS sample): Identifies the effect of receiving a phone as part of the experiment, since both groups are interviewed in person.
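Each comparison above reduces to a difference in mean outcomes between two interview groups. A minimal sketch of that estimator (the outcome values below are hypothetical placeholders, not study data):

```python
from statistics import mean


def contrast(group_a, group_b):
    """Difference in mean outcomes between two interview groups.

    Each argument is a list of responses to the same question;
    a positive value means group A reports higher outcomes on average.
    """
    return mean(group_a) - mean(group_b)


# Hypothetical weekly farm-labor hours by group and mode (illustrative values only).
phone_t1 = [20, 25, 18, 22]     # first interview, phone (Treatment Group 1)
inperson_t2 = [19, 24, 20, 21]  # first interview, in person (Treatment Group 2)
phone_t2 = [21, 23, 19, 22]     # second interview, phone (Treatment Group 2)

mode_effect = contrast(phone_t1, inperson_t2)       # main specification
anchoring_effect = contrast(phone_t1, phone_t2)     # phone (T1) vs. phone (T2)
```

In practice each contrast would be estimated with standard errors (e.g., via regression with clustering at the enumeration-area level), but the point estimates are simple group-mean differences as above.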