Incentives for firm transparency in large-scale surveys
Initial registration date
July 04, 2020
July 08, 2020 5:13 PM EDT
University of Mannheim
Other Primary Investigator(s)
Additional Trial Information
In this study we designed five incentive treatments to evaluate their effectiveness at inducing firms to provide information in a large-scale survey. As a novel incentive, we invite some of the firms to advertise their business on our website. In another treatment, we use the authority of an important institution that explicitly calls for participation. A third treatment offers firms industry reports as an incentive to participate. All these treatments build on the baseline treatment, in which we invite firms to participate with a short message and no additional incentive. In the final treatment, we extend this short message with information about the research program but no additional incentives. This is related to the letter treatments in Dwenger et al. (2016), De Neve et al. (2019), and Chetty et al. (2014).
See experimental design and attached documents
Intervention Start Date
Intervention End Date
Primary Outcomes (end points)
i) response rate
ii) completion rate
iii) agreement to be available for follow-up studies
Primary Outcomes (explanation)
We construct response rates, completion rates, and the aggregate number of firms available for follow-up studies for each of the treatment arms.
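As an illustration only, the per-arm outcomes described above could be tabulated as follows. The field names (`treatment`, `responded`, `completed`, `followup_ok`) are hypothetical; the registration does not specify the data layout.

```python
from collections import defaultdict

def outcomes_by_arm(records):
    """Compute primary outcomes per treatment arm.

    records: iterable of dicts with keys 'treatment' (arm label) and
    booleans 'responded', 'completed', 'followup_ok'.
    Returns, per arm: response rate, completion rate, and the count of
    firms agreeing to be available for follow-up studies.
    """
    stats = defaultdict(lambda: {"n": 0, "responded": 0,
                                 "completed": 0, "followup_ok": 0})
    for r in records:
        s = stats[r["treatment"]]
        s["n"] += 1
        s["responded"] += r["responded"]      # bools count as 0/1
        s["completed"] += r["completed"]
        s["followup_ok"] += r["followup_ok"]
    return {
        arm: {
            "response_rate": s["responded"] / s["n"],
            "completion_rate": s["completed"] / s["n"],
            "followup_count": s["followup_ok"],
        }
        for arm, s in stats.items()
    }
```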
Secondary Outcomes (end points)
Secondary Outcomes (explanation)
In a large-scale survey of firms, we vary the way we approach the firms. We send out five different versions of the invitation letter:
i) Letter with a link to a support letter from a prominent institution
ii) Letter highlighting that we will provide firms with industry reports
iii) Letter allowing firms to advertise on our webpage
iv) Long letter
v) Short letter
Experimental Design Details
We generate a randomized sequence of integers using atmospheric noise on a computer. Each realization indicates the day of the week (1-5, starting with Monday) that is assigned to one of the five treatments. In the following weeks, the sequence is rotated by decrementing each day by one, i.e. if a treatment was assigned to day 3 in week 1, it is assigned to day 2 in week 2. A decrement to day 0 wraps around to day 5.
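The weekly rotation described above can be sketched as a small function. This is a minimal sketch of the stated rule only; the function name and signature are our own, and the initial day-to-treatment assignment itself comes from the atmospheric-noise draw.

```python
def day_for_week(initial_day: int, week: int) -> int:
    """Day of the week (1=Monday .. 5=Friday) on which a treatment
    runs in a given week, under the registration's rotation rule.

    initial_day: day assigned to the treatment in week 1 (1-5)
    week: 1-based week index

    The day decrements by one each week; a decrement to day 0
    wraps around to day 5.
    """
    return (initial_day - (week - 1) - 1) % 5 + 1
```

For example, a treatment assigned to day 3 in week 1 runs on day 2 in week 2, day 1 in week 3, and wraps around to day 5 in week 4.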
Was the treatment clustered?
Sample size: planned number of clusters
The treatments are not clustered. See below for the number of observed firms.
Sample size: planned number of observations
We approach approximately 600,000 firms and invite them to participate in our survey. We expect a response rate between 0.5% and 5%. For example, a response rate of 1% would imply that we have 6,000 independent firm-level observations.
Sample size (or number of clusters) by treatment arms
Even distribution of observations across the five treatment arms.
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
INSTITUTIONAL REVIEW BOARDS (IRBs)