Field experiments improve the credibility of economic research, and the number of experiments conducted in developing countries is ever increasing. These studies are often based on survey data. At the same time, research on the ethical aspects of collecting survey data in developing countries remains scarce. There is, however, a responsibility to engage with the ethical requirements of survey data collection. In this paper, we investigate informational constraints on the consent of potential survey participants. First, we test whether survey participants are sufficiently informed about what happens to their data and about their rights. Second, we experimentally test whether an interactive, audio-visually supported approach can improve how well informed they are. In this approach, the content of the consent form is presented in a structured dialog and illustrated with a short video. We further investigate the implications for response behavior and data quality.
External Link(s)
Citation
Avdeenko, Alexandra and Matthias Stelter. 2020. "How Informed is Consent? A Field Experiment." AEA RCT Registry. November 30. https://doi.org/10.1257/rct.6829-1.0.
In a survey data collection, we will randomly vary how the consent form is presented, testing an alternative approach that combines an audio-visual component with a scripted, interactive process.
Intervention Start Date
2020-12-01
Intervention End Date
2021-04-01
Primary Outcomes (end points)
We have three main outcomes of interest: (1) ethical aspects related to different levels of being informed, (2) data quality, and (3) external validity. With our survey instrument we aim to measure objective knowledge and subjective understanding of the study purpose, the voluntary nature of participation, data confidentiality, and respondents' rights with respect to data protection.
Primary Outcomes (explanation)
A. Knowledge about Rights and about Implications of Participating

Main outcomes:
Summary score: Rights w.r.t. data protection
Summary score: Purpose
Summary score: Voluntary participation
Summary score: Data confidentiality
Overall knowledge summary score (mean of all seven one-item scores)
Indicator for sufficiently informed respondents (equal to one if the overall knowledge summary score is above 70%; an illustrative computation sketch follows this list)
Overall subjective understanding summary score (mean of all four one-item scores)
Indicator for “I understood this well” OR “I understood this very well” to: “The purpose of this survey.”
Indicator for “I understood this well” OR “I understood this very well” to: “My participation in the interview is fully voluntary.”
Indicator for “I understood this well” OR “I understood this very well” to: “How the confidentiality of my information is ensured.”
Indicator for “I understood this well” OR “I understood this very well” to: “My rights with respect to data protection and storage.”

To analyze the relation of subjective and actual understanding of respondents:
Y5.i: Respondent type w.r.t. purpose: high actual and subjective understanding; low actual but high subjective understanding; low actual and low subjective understanding; high actual but low subjective understanding
Y5.ii: Respondent type w.r.t. voluntary nature: high actual and subjective understanding
Y5.iii: Respondent type w.r.t. confidentiality: high actual and subjective understanding
Y5.iv: Respondent type w.r.t. rights: high actual and subjective understanding
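The summary scores and the "sufficiently informed" indicator lend themselves to a simple computation. The sketch below is illustrative only: the item column names and their allocation to the four domains are hypothetical placeholders, not the variables of our instrument, and items are assumed to be scored in [0, 1].

```python
import pandas as pd

# Hypothetical item columns, scored in [0, 1]; the variable names and the split of the
# seven knowledge items across the four domains are assumptions made for this sketch.
KNOWLEDGE_ITEMS = {
    "rights":          ["know_rights_1", "know_rights_2"],
    "purpose":         ["know_purpose_1", "know_purpose_2"],
    "voluntariness":   ["know_voluntary_1", "know_voluntary_2"],
    "confidentiality": ["know_confidential_1"],
}

def knowledge_scores(df: pd.DataFrame) -> pd.DataFrame:
    """Domain summary scores, overall knowledge score, and 'sufficiently informed' flag."""
    out = pd.DataFrame(index=df.index)
    for domain, items in KNOWLEDGE_ITEMS.items():
        out[f"score_{domain}"] = df[items].mean(axis=1)
    all_items = [c for items in KNOWLEDGE_ITEMS.values() for c in items]
    # Overall knowledge summary score: mean of all seven one-item scores.
    out["score_overall"] = df[all_items].mean(axis=1)
    # Indicator for sufficiently informed respondents: overall score above 70%.
    out["sufficiently_informed"] = (out["score_overall"] > 0.70).astype(int)
    return out
```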
B. Data Quality

Y6.i: Indicator for whether the (guardian of the) potential respondent gave consent to be interviewed by providing oral consent
For robustness:
Y6.ii: Indicator for whether the (guardian of the) potential respondent gave consent (and if needed assent) to be interviewed by providing oral consent
Y7.i: Share of all “refuse to answer” replies among sensitive questions
For robustness:
Y7.ii: Share of all “refuse to answer” replies among all questions
Y7.iii: Share of all “refuse to answer” and “don’t know” replies among all questions
Y7.iv: Share of all “refuse to answer” and “don’t know” replies among sensitive questions
A question is considered sensitive if at least one respondent refused to answer it. Note that this outcome can only be measured for those who gave consent and will therefore be analyzed for respondents only.
Y8.i: Share of inconsistencies among all potential inconsistencies
Y8.ii: Number of modules with non-differentiated (“straightlined”) responses
Y8.iii: Interview duration after consent
Y8.iv: Interview speed (time per question asked, i.e. duration of interview / number of questions after consent)
(An illustrative computation of the Y7 and Y8 outcomes is sketched below, after the list of sample-composition outcomes.)

C. Sample Composition
Y9.i: Respondent’s age
Y9.ii: Respondent’s sex
Y9.iii: Respondent’s education
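As a minimal sketch of how the refusal shares (Y7) and the straightlining count (Y8.ii) could be operationalized: the code below assumes long-format answer data with hypothetical columns respondent_id, question_id, module, and answer, and assumes that answers have already been recoded to text labels. It is not the actual processing pipeline.

```python
import pandas as pd

REFUSE = "refuse to answer"
DONT_KNOW = "don't know"

def data_quality_outcomes(responses: pd.DataFrame) -> pd.DataFrame:
    """Per-respondent data-quality measures from long-format answer data
    (hypothetical columns: respondent_id, question_id, module, answer)."""
    # A question counts as sensitive if at least one respondent refused to answer it.
    sensitive = set(responses.loc[responses["answer"] == REFUSE, "question_id"])

    def share(g: pd.DataFrame, codes, subset=None) -> float:
        # Share of replies in `codes`, optionally restricted to a subset of questions.
        if subset is not None:
            g = g[g["question_id"].isin(subset)]
        return g["answer"].isin(codes).mean()

    rows = {}
    for rid, g in responses.groupby("respondent_id"):
        rows[rid] = {
            "Y7i_refuse_sensitive":     share(g, [REFUSE], sensitive),
            "Y7ii_refuse_all":          share(g, [REFUSE]),
            "Y7iii_refuse_dk_all":      share(g, [REFUSE, DONT_KNOW]),
            "Y7iv_refuse_dk_sensitive": share(g, [REFUSE, DONT_KNOW], sensitive),
            # Y8.ii: modules in which every answer is identical ("straightlined").
            "Y8ii_straightlined_modules": int(
                g.groupby("module")["answer"].nunique().eq(1).sum()
            ),
        }
    return pd.DataFrame.from_dict(rows, orient="index")
```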
Further, due to limited questionnaire space, the full set of questions required to measure outcomes such as how informed respondents are is posed to only 12% of the sample: 6% in the Audio-visual Supported and Scripted Interactive Consent Form group and 6% in the control group. The remaining 88% are asked only two of these questions, chosen at random.
Secondary Outcomes (end points)
Secondary Outcomes (explanation)
Experimental Design
We plan to implement the experiment alongside an already planned data collection in rural Pakistan. Thus potential effects are established in a real rather than a laboratory setting. In more detail, we will have a sample of over 7,500 respondents, of whom more than half will need to provide assent (because the respondent is under the age of 18) while consent is provided by a guardian. Since the study is an independent add-on to a different study, we want to minimize any potential influence on the main survey. Therefore the majority will be randomized on-site to the control group (50%), 44% will be assigned to the Audio-visual Supported Consent Form treatment, and only a small share (6%) will be assigned to the full Audio-visual Supported and Scripted Interactive Consent Form treatment.
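The assignment is coded directly into the survey instrument and conceptually amounts to one independent draw per respondent with probabilities 0.50 / 0.44 / 0.06. The sketch below only mimics this logic for illustration; the arm labels and the seed are placeholders, and the actual draw is implemented in the survey instrument itself.

```python
import random

# Assignment probabilities for the three arms (control / Treatment 1 / Treatment 2).
ARMS = ["control", "audio_visual", "audio_visual_scripted"]
PROBS = [0.50, 0.44, 0.06]

def assign_arm(rng: random.Random) -> str:
    """Individual-level, on-site randomization: one independent draw per respondent."""
    return rng.choices(ARMS, weights=PROBS, k=1)[0]

# Example: simulate the assignment for a sample of 7,500 respondents.
rng = random.Random(2020)  # seed chosen arbitrarily for the example
assignments = [assign_arm(rng) for _ in range(7500)]
```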
Experimental Design Details
Not available
Randomization Method
coded in the survey instruments (randomization on-site)
Randomization Unit
individual
Was the treatment clustered?
No
Sample size: planned number of clusters
no clusters
Sample size: planned number of observations
>7500 individuals
Sample size (or number of clusters) by treatment arms
Control group (50%)
Treatment arm 1: Audio-visual Supported Consent Form treatment (44%)
Treatment arm 2: full Audio-visual Supported and Scripted Interactive Consent Form treatment (6%)
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
Depending on the underlying assumptions, we are powered to detect changes of less than 5 percentage points in consent rates and changes of 0.2 in our measure of objective understanding (measured on a scale from -1 to 1). We therefore deem the study sufficiently powered, considering that these calculations are conservative and that changes in consent rates much below 5 percentage points, or changes in understanding below 0.2, would be of limited practical relevance. A more detailed calculation is presented in the registered PAP.
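The registered PAP contains the full power calculation. Purely to illustrate the order of magnitude, a textbook two-sample normal approximation for a difference in proportions can be sketched as below; the assumed control-group consent rate, significance level, and power are placeholders for this sketch, not the PAP parameters.

```python
from math import sqrt
from scipy.stats import norm

def mde_proportion(p0: float, n_treat: int, n_control: int,
                   alpha: float = 0.05, power: float = 0.80) -> float:
    """Minimum detectable difference in a proportion (two-sided test,
    normal approximation, variance evaluated at the control rate p0)."""
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    se = sqrt(p0 * (1 - p0) * (1 / n_treat + 1 / n_control))
    return z * se

# Illustrative numbers only: a control-group consent rate of 0.9 is an assumption
# made for this sketch (not a figure from the PAP); group sizes follow the planned
# 44% / 50% split of roughly 7,500 respondents.
print(round(mde_proportion(p0=0.9, n_treat=int(7500 * 0.44), n_control=int(7500 * 0.50)), 3))
```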