
Minnesota COVID-19 Testing
Last registered on September 28, 2020

Pre-Trial

Trial Information
General Information
Title
Minnesota COVID-19 Testing
RCT ID
AEARCTR-0006278
Initial registration date
September 17, 2020
Last updated
September 28, 2020 4:01 PM EDT
Location(s)

This section is unavailable to the public. Access to this information may be requested through the registry.
Primary Investigator
Affiliation
Harvard Kennedy School
Other Primary Investigator(s)
PI Affiliation
Massachusetts Institute of Technology
PI Affiliation
Massachusetts Institute of Technology
PI Affiliation
Massachusetts Institute of Technology
PI Affiliation
Stanford University
PI Affiliation
Yale University
PI Affiliation
Harvard University
Additional Trial Information
Status
Ongoing
Start date
2020-09-28
End date
2021-06-30
Secondary IDs
OSF Registration: DOI 10.17605/OSF.IO/ZBRF9
Abstract
In the United States, recent statistics show that African American and Latinx communities bear a disproportionate burden from COVID-19. Reaching vulnerable and underserved populations is therefore crucial to combating the disease. However, most public messaging campaigns are not targeted toward underserved communities and don't address fears of social stigma, mistrust in the healthcare system, or concerns about immigration status.

The goal of this project is to help the state of Minnesota understand why individuals are not getting tested and potentially identify trusted individuals or organizations that could be used in follow-up work to send messages. To do so, we are deploying flyers through 10 Twin Cities area food shelves, and potentially through public housing units, with information on how to access an online questionnaire.

This provides us with an opportunity to study who answers surveys and why, and which questions are particularly sensitive. This is of general interest to academics and policymakers alike.

According to Meyer, Mok, and Sullivan (2015), the quality of household surveys is in decline for three main reasons. First, households have become increasingly less likely to answer surveys at all (unit nonresponse). Second, those that respond are less likely to answer certain questions (item nonresponse). Third, when households do provide answers, they are less likely to be accurate (measurement error). This is important because household surveys underpin estimates of the employment rate and healthcare needs, and the census determines the allocation of resources and political representation.

We focus on the first two issues of unit and item nonresponse, which is not random across the population and thus could lead to nonresponse bias. Griffin (2002) found that census tracts with predominantly Hispanic or Black residents had significantly lower response rates to the American Community Survey as compared to the response rates in predominantly white tracts. Similarly, Maitland et al. (2017) found that response rates to the Health Information National Trends Survey (HINTS) were lower in areas with higher levels of Hispanic and minority residents.

We hypothesize that financial incentives may encourage unit response; conversely, a close association with the government may discourage response. To test these hypotheses, we plan to cross-randomize the incentive amount offered and the emphasis placed on government involvement in the study on flyers advertising the baseline survey. Individuals will see either a) a 10 dollar incentive, or b) a 20 dollar incentive; and either a) messaging that emphasizes government involvement in the study, or b) messaging that emphasizes the involvement of academic researchers. Flyers will be randomized at the food shelf-day level.

To test what affects item nonresponse on potentially sensitive questions, such as questions which ask for health information, we hypothesize that ethical framing may encourage individuals to answer questions. This takes two forms: the deontological (or duty-based) frame and the consequential (or cost-benefit) frame. Moreover, knowing that others feel the same way (regarding the obligation or benefits of providing health information) may amplify motivation. Finally, emphasizing the racial and ethnic disparities in COVID-19 outcomes may also reduce item nonresponse on sensitive questions.

Upon completion of the demographic module of the survey but prior to starting several potentially sensitive survey modules, individuals will see a message that either a) emphasizes the public health benefits of answering the survey questions (cost-benefit frame); b) emphasizes an individual's responsibility to their community (duty frame); c) emphasizes the disproportionate impact of COVID-19 on certain ethnic and racial groups; or d) provides no messaging. Messaging content will be randomized at the individual level.
External Link(s)
Registration Citation
Citation
Alsan, Marcella et al. 2020. "Minnesota COVID-19 Testing." AEA RCT Registry. September 28. https://doi.org/10.1257/rct.6278-1.3.
Sponsors & Partners

There are documents in this trial unavailable to the public. Access to these documents may be requested through the registry.
Experimental Details
Interventions
Intervention(s)
We plan to cross randomize the incentive amount offered and the emphasis placed on government involvement in the study on the flyers advertising the baseline survey. Individuals will see either a) a 10 dollar incentive, or b) a 20 dollar incentive; and either a) messaging that emphasizes government involvement in the study, or b) messaging that emphasizes the involvement of academic researchers.

Flyers will be randomized at the food shelf-day level. To test what affects item nonresponse on potentially sensitive questions, such as those asking for health information, we hypothesize that ethical framing may encourage people to answer questions. This takes two forms: the deontological (or duty-based) frame and the consequential (or cost-benefit) frame. Moreover, knowing that others feel the same way (regarding the obligation or benefits of providing health information) may amplify the motivation. Finally, acknowledging the disproportionate impact of COVID-19 on certain ethnic and racial groups may also encourage response. Prior to seeing several potentially sensitive survey blocks, individuals will see either a) messaging that emphasizes the public health benefits of answering the survey questions (cost-benefit frame); b) messaging that emphasizes an individual's responsibility to their community (duty frame); c) messaging that acknowledges the disproportionate impact of COVID-19 on certain ethnic and racial groups; or d) no messaging. Messaging content will be randomized at the individual level.
Intervention Start Date
2020-09-28
Intervention End Date
2020-10-10
Primary Outcomes
Primary Outcomes (end points)
Our primary outcomes of interest are:
- whether higher monetary incentives increase unit response
- whether a government frame reduces unit response
- whether incentives and a de-emphasis on government are complements or substitutes in increasing unit response
- which frame/incentive combinations diffuse most from the initial point of distribution throughout the community
- how the characteristics of individuals who respond differ by treatment (i.e., frames and incentive structures)
- whether we can extrapolate using statistical methods and the randomized estimates to obtain population level estimates and approximate missing mass in government surveys
- whether various ethical and (racial/ethnic) acknowledgement frames improve item nonresponse (including quality)
- how item nonresponse frames interact with variation in the composition of respondents induced through the randomized incentives/frames
Primary Outcomes (explanation)
Secondary Outcomes
Secondary Outcomes (end points)
Secondary Outcomes (explanation)
Experimental Design
Experimental Design
1 Recruitment and Sampling
We are recruiting subjects through 10 food shelves in the Twin Cities area. Food shelves will place flyers (see Figures 1 and 2) advertising the baseline survey in bags of food and prepackaged meals that are distributed to food shelf users. We require all participants to be age 18 or older and to speak English or Spanish. We will use a touchless delivery system to drop off and redeem flyers on a daily basis. Our target sample size is 1,000 survey responses. We may stay in the field longer at (i.e., oversample from) food shelves that tend to serve minority individuals so that we can increase their representation in the survey.

2 Experimental Protocols
2.1. Randomization of Flyers (Unit Response)
(a) Randomly assign survey advertising flyers at the food shelf-day level.
(b) Food shelves distribute the randomized daily flyer via food bags and prepackaged meals.

2.2. Recruitment and Baseline Survey
(a) For all potential study participants, elicit a preference for English or Spanish. Those with a preference for Spanish will be given a Spanish language consent form and survey questions.
(b) Individuals provide consent.
(c) Collect demographic information

2.3. Randomization of messaging content before sensitive survey modules (Item Response)
(a) After completing the demographics survey module, participants are randomly assigned to one of four ethical motivation messaging options: deontological, consequential, acknowledgement of racial inequities, or no message.
(b) Participants are shown the same randomly assigned message prior to each sensitive survey module.
(c) Participants answer survey modules 2-5, covering media attitudes, health and COVID-19, and discrimination in healthcare.

2.4. Participants answer debrief questions and are prompted to get tested for COVID-19 if they have any officially recognized symptoms. Participants are also notified that they may share the survey with family and friends.

2.5. Electronic gift cards are texted to participants who complete the survey to compensate them for their time.

3 Randomization: We will randomly assign food shelf-dates to the four different treatments using Stata 14.2 for the unit nonresponse outcomes. Robust standard errors will be used as the level of outcome is the same as the unit of randomization. For item nonresponse outcomes, we will randomize at the individual level in Qualtrics.
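As a concrete illustration, the two-stage assignment described above could be sketched as follows. The actual study uses Stata 14.2 for the food shelf-day randomization and Qualtrics for the individual-level randomization, so this Python version is purely illustrative; the shelf count, day count, and seed are hypothetical.

```python
import random

random.seed(6278)  # arbitrary seed, for reproducibility of the sketch only

# 2x2 flyer arms (incentive amount x framing), assigned at the
# food shelf-day level.
FLYER_ARMS = [
    ("$10", "government"),
    ("$10", "researcher"),
    ("$20", "government"),
    ("$20", "researcher"),
]

def assign_flyer_arms(shelf_days):
    """Randomly assign each food shelf-day to one of the four flyer arms,
    keeping arm sizes as balanced as the number of shelf-days allows."""
    arms = [FLYER_ARMS[i % len(FLYER_ARMS)] for i in range(len(shelf_days))]
    random.shuffle(arms)
    return dict(zip(shelf_days, arms))

# Item-response message arms, assigned at the individual level
# (done in Qualtrics in the actual study).
MESSAGE_ARMS = ["none", "consequential", "deontological", "racial_inequities"]

def assign_message(participant_id):
    """Independent individual-level draw among the four message arms."""
    return random.choice(MESSAGE_ARMS)

# Hypothetical roster: 10 food shelves observed over 8 days each.
shelf_days = [(shelf, day) for shelf in range(10) for day in range(8)]
flyer_assignment = assign_flyer_arms(shelf_days)
```

Because the outcome (daily take-up per flyer batch) is measured at the same level as the randomization, the analysis can use robust rather than clustered standard errors, as noted above.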

4 Survey Messaging Content
One of the following message options will be randomly assigned to each participant after they complete the demographics module of the survey.
4.1. No framing – No message displayed.
4.2. Consequential framing – "Answering the questions in this survey is an easy way that you can help improve the public health response to COVID-19 in your community."
4.3. Deontological framing – "It is important for everyone to do their part to protect their community during the COVID-19 pandemic."
4.4. Acknowledgement of racial inequities – "COVID-19 is affecting everyone, but is hitting African American and Latinx communities particularly hard."
Experimental Design Details
Not available
Randomization Method
We will use Stata for unit response randomization and computer randomization via Qualtrics for item response randomization.
Randomization Unit
1. Unit Response Randomization: randomization at the food shelf-day level.
2. Item Response Randomization: individual-level randomization.
Was the treatment clustered?
Yes
Experiment Characteristics
Sample size: planned number of clusters
1. Unit Response Randomization: 74 food shelf-days.
2. Item Response Randomization: individual-level randomization.
Sample size: planned number of observations
1,000 individuals.
Sample size (or number of clusters) by treatment arms
250 No framing, 250 Consequential framing, 250 Deontological framing, 250 Acknowledgement of racial inequities.
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
Our power calculations for the effect of non-government framing on unit response compute minimum detectable effects (MDEs) conditional on given levels of control group unit response (i.e., take-up), which we vary from 10% to 30%. We also vary the amount of intra-cluster correlation (ICC), as our unit of randomization is the food shelf-date. Our calculations assume that our data features 74 food shelf-dates, evenly allocated to two framing arms (government and researcher), with an average of 100 flyers distributed per food shelf-date. Fixing government framing as the control group and control take-up at 10%, our calculations imply an MDE of 2.07 percentage points (or 20.7% greater than the control group mean) in the absence of intra-cluster correlation. Holding control take-up constant at 10%, the required MDE rises to 3.68 percentage points (+36.8%) given 2% intra-cluster correlation and to 7.46 percentage points (+74.6%) given 10% ICC. If take-up in the control group is 30%, the required MDE falls to 3.07 percentage points (or 10.2% greater than the control mean) in the absence of ICC, 5.34 percentage points (+17.8%) given 2% ICC, and 10.35 percentage points (+34.5%) given 10% ICC.

In computing MDEs for item response, we assume our target sample size of 1,000 survey respondents is evenly divided across four messaging arms. Given 70% item response in the control group of no messaging, we would require a 10.77 percentage point effect size (or 15.38% higher than the control mean) for treatment effects to be detectable. As item response in the control group rises, MDEs fall: given 90% item response in the no-messaging group, the MDE is 6.31 percentage points (+7.01% from the control mean).
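For readers wishing to check the order of magnitude of these figures, a simplified two-proportion MDE calculation under the normal approximation, inflating variance by the design effect 1 + (m − 1) × ICC, can be sketched in Python. The registration's exact numbers come from the authors' own power calculations and will not be reproduced exactly by this approximation.

```python
from math import sqrt
from statistics import NormalDist

def mde_two_proportions(p0, n_per_arm, cluster_size=1, icc=0.0,
                        alpha=0.05, power=0.80):
    """Approximate minimum detectable effect (as a proportion) for a
    two-arm comparison of proportions with control rate p0, inflating
    the variance by the design effect 1 + (m - 1) * ICC for clustered
    assignment with m observations per cluster."""
    z = NormalDist().inv_cdf(1 - alpha / 2) + NormalDist().inv_cdf(power)
    deff = 1 + (cluster_size - 1) * icc
    se = sqrt(2 * p0 * (1 - p0) / n_per_arm) * sqrt(deff)
    return z * se

# 74 food shelf-dates split across two framing arms, ~100 flyers each,
# so roughly 37 * 100 = 3,700 flyers per arm.
n_arm = 37 * 100
for icc in (0.0, 0.02, 0.10):
    mde = mde_two_proportions(0.10, n_arm, cluster_size=100, icc=icc)
    print(f"ICC={icc:.2f}: MDE approx {100 * mde:.2f} pp")
```

With 10% control take-up and no ICC this yields an MDE of roughly 2 percentage points, in line with the 2.07 points reported above, and the MDE grows with the ICC as expected.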
IRB
INSTITUTIONAL REVIEW BOARDS (IRBs)
IRB Name
Harvard University
IRB Approval Date
2020-09-14
IRB Approval Number
IRB20-1444
Analysis Plan
Analysis Plan Documents
Pre Analysis Plan

MD5: 49bdef9e47e848049e8220b96241e3f0

SHA1: ddf670b628bee5f80435478b9143d4559db299cd

Uploaded At: September 24, 2020