Social Norms and Online Survey Participation

Last registered on April 28, 2022


Trial Information

General Information

Title
Social Norms and Online Survey Participation
RCT ID
AEARCTR-0009303
Initial registration date
April 24, 2022


First published
April 28, 2022, 6:05 PM EDT


Locations

Some information in this trial is not available to the public.

Primary Investigator

Affiliation
University of Erfurt

Other Primary Investigator(s)

PI Affiliation
Nuremberg Institute of Technology

Additional Trial Information

Status
Ongoing
Start date
2022-03-15
End date
2023-10-31
Secondary IDs
Prior work
This trial does not extend or rely on any prior RCTs.
Abstract
Researchers often rely on data from voluntary (online) surveys, for instance, to investigate effects on outcomes that are not available in administrative data (e.g., well-being and non-cognitive skills) or to study potential mechanisms that drive effects (e.g., behavioral changes may occur due to an update of individuals’ beliefs). However, survey response rates are often low, which reduces the statistical power of subsequent analyses. In addition, survey participation may be selective, which can introduce nonresponse bias. Using a large sample of university students, we investigate whether social norms can be used to increase participation rates and to reduce nonresponse bias in a voluntary online survey.
External Link(s)

Registration Citation

Citation
Brade, Raphael and Robert Jäckle. 2022. "Social Norms and Online Survey Participation." AEA RCT Registry. April 28. https://doi.org/10.1257/rct.9303-1.0
Experimental Details

Interventions

Intervention(s)
The intervention will be implemented as part of the invitations to two online surveys conducted in the 2022 summer semester with students of all 27 bachelor’s degree programs at a university of applied sciences. In all experimental groups, students will receive an email invitation and four email reminders asking for their participation in the surveys.
Intervention Start Date
2022-04-25
Intervention End Date
2022-05-29

Primary Outcomes

Primary Outcomes (end points)
Survey participation
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
An increase in survey participation can, but need not, improve survey quality. To assess how survey quality is affected by the treatments and to study potential mechanisms, we will also investigate effects along the following dimensions:

1. Nonresponse bias: Nonresponse bias will be assessed by approaches such as analyzing differences in observable administrative background variables between respondents and nonrespondents (such as age, sex, features of the university entrance qualification, and prior academic performance).

2. Quality of survey responses: We will further approximate the quality of the survey responses using variables based on dimensions such as answering speed, item nonresponse, survey drop-out, and item nondifferentiation.

3. Beliefs: Finally, to study potential mechanisms we will investigate effects on students’ beliefs about the participation rate among all students and the confidence in those beliefs (asked in the surveys).
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
Students were randomly allocated to a control group and seven treatment groups.
Experimental Design Details
Not available
Randomization Method
Blocked randomization using Stata
Randomization Unit
Student-level
Was the treatment clustered?
No
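The registry states that blocked randomization was carried out in Stata at the student level. As a hedged illustration only (the actual Stata code and blocking variables are not public), the logic can be sketched in Python; the use of degree program as the blocking variable and the seed are assumptions for the example, not details from the trial.

```python
import random
from collections import defaultdict

def blocked_assignment(students, n_arms=8, seed=9303):
    """Sketch of student-level blocked randomization into n_arms groups.

    `students` is a list of (student_id, block) pairs; `block` stands in
    for a blocking variable such as degree program (assumed here).
    Within each block, members are shuffled and then assigned to arms
    cyclically, which keeps arm sizes balanced within every block.
    """
    rng = random.Random(seed)
    blocks = defaultdict(list)
    for sid, block in students:
        blocks[block].append(sid)
    assignment = {}
    for members in blocks.values():
        rng.shuffle(members)          # random order within the block
        for i, sid in enumerate(members):
            assignment[sid] = i % n_arms  # arms rotate within each block
    return assignment
```

With 27 blocks (mirroring the 27 degree programs) and eight arms, this produces arm sizes that differ by at most one student per block, consistent with the near-equal group sizes reported below.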

Experiment Characteristics

Sample size: planned number of clusters
9,276 students
Sample size: planned number of observations
9,276 students
Sample size (or number of clusters) by treatment arms
Group 1: 1,161
Group 2: 1,162
Group 3: 1,159
Group 4: 1,163
Group 5: 1,153
Group 6: 1,162
Group 7: 1,156
Group 8: 1,160
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
MDESs for effects in percentage points range from 1.42 pp to 5.2 pp. Standardized MDESs range from d = 0.046 to d = 0.116.
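The registration does not show the power calculation itself, but the standardized MDES for a two-group mean comparison follows the standard formula MDES = (z_{1-α/2} + z_power) · √(1/n₁ + 1/n₂). As a hedged plausibility check (the authors' actual assumptions about α, power, and covariate adjustment are not stated), a pairwise comparison of two arms of roughly 1,160 students at α = 0.05 (two-sided) and 80% power reproduces the upper end of the stated range:

```python
from statistics import NormalDist

def mdes_d(n1, n2, alpha=0.05, power=0.80):
    """Standardized minimum detectable effect size for a two-sided
    two-group mean comparison:
    MDES = (z_{1-alpha/2} + z_{power}) * sqrt(1/n1 + 1/n2)."""
    z = NormalDist()
    multiplier = z.inv_cdf(1 - alpha / 2) + z.inv_cdf(power)
    return multiplier * (1 / n1 + 1 / n2) ** 0.5

# Comparing two arms of ~1,160 students each (sizes taken from the
# treatment-arm table above):
print(round(mdes_d(1161, 1160), 3))  # ≈ 0.116
```

The smaller MDESs in the stated range would correspond to comparisons that pool several arms and/or adjust for baseline covariates; those exact calculations are not reported here.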
IRB

Institutional Review Boards (IRBs)

IRB Name
IRB Approval Date
IRB Approval Number