Can locally-targeted feedback encourage the use of COVID contact tracing apps?
Last registered on October 06, 2020


Trial Information
General Information
Can locally-targeted feedback encourage the use of COVID contact tracing apps?
Initial registration date
September 26, 2020
Last updated
October 06, 2020 5:54 AM EDT

This section is unavailable to the public.
Primary Investigator
University of Bonn
Other Primary Investigator(s)
PI Affiliation
University of Bonn
PI Affiliation
National University of Singapore
PI Affiliation
University of Bonn, National University of Singapore
Additional Trial Information
In development
Start date
End date
Secondary IDs
One important tool for controlling the COVID-19 pandemic lies in fast and effective detection of individuals who have been in close contact with infected persons. In this context, contact tracing apps such as the German "Corona-Warn-App" (CWA) can play an important role.
Installation and use of the contact tracing app represents a public good: everyone, even those who do not install the app, benefits from faster tracking of suspicious cases. Yet the take-up rate of the CWA remained at only about 20 percent of the German population even three months after roll-out. One potential channel to increase the willingness to contribute may be to emphasize the benefits for peers in the immediate social environment, since individuals often behave more prosocially in groups they personally identify with. Controlled laboratory studies further show that social comparison between groups can lead to greater cooperation in the provision of public goods. However, there has been little empirical research in field settings so far.

In this project, we investigate whether and how targeted feedback on local COVID-19 incidence rates and social comparisons with other regions can increase the willingness to install the CWA. We test this on a large-scale through targeted social media ads. Additional online surveys aim to further inform us about the relevant behavioral mechanisms of the intervention.
External Link(s)
Registration Citation
Chen, Zihua et al. 2020. "Can locally-targeted feedback encourage the use of COVID contact tracing apps?." AEA RCT Registry. October 06. https://doi.org/10.1257/rct.6529-1.1.
Experimental Details
For our main interventions, we deliver ads on Facebook that are targeted at the county/city level and provide feedback on the local incidence rates as well as a comparison with other counties/cities in Germany.
Intervention Start Date
Intervention End Date
Primary Outcomes
Primary Outcomes (end points)
Click-through rate of the Facebook ads, which link to the official homepage of the Corona-Warn-App
Primary Outcomes (explanation)
Secondary Outcomes
Secondary Outcomes (end points)
Views of the ad video as an intermediate outcome
Secondary Outcomes (explanation)
Treatment could in principle affect click rates through both the extensive margin (engaging with the ad at all) and the intensive margin (responding to the ad's content). Video view metrics can help us disentangle the two, at least if selection effects are limited.
Experimental Design
Experimental Design
Targeted (video) ads:
Control group: conventional ad for the CWA (from the actual marketing campaign), including an info slogan about the effectiveness of the app in stopping infection chains
Treatment 1: feedback on the local incidence rate and a comparison with the rest of the state; the video makes the comparison salient and adds an injunctive norm
Treatment 2: like Treatment 1, but with additional appeals depending on whether the comparison is favorable (maintain the status quo) or unfavorable (call for change)

We furthermore add a survey measure of the baseline take-up rate in a holdout group of Facebook users through a video ad poll. The poll is included in the control group video, but without a link.
Experimental Design Details
Not available
Randomization Method
Randomization through Facebook's A/B testing functionality
For the online survey, randomization by computer
Randomization Unit
Randomization at the individual level
Was the treatment clustered?
Experiment Characteristics
Sample size: planned number of clusters
We aim for about 3–4 million impressions on Facebook
Online survey: 1200 subjects
Sample size: planned number of observations
Same as the number of clusters
Sample size (or number of clusters) by treatment arms
We aim for an equal number of observations in each treatment condition
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
IRB Name
German Association for Experimental Economic Research e.V.
IRB Approval Date
IRB Approval Number