Cross-Cultural Biases and Inattention

Last registered on September 03, 2021


Trial Information

General Information

Cross-Cultural Biases and Inattention
Initial registration date
September 02, 2021


First published
September 03, 2021, 5:28 PM EDT




Primary Investigator

Busara Center for Behavioral Economics

Other Primary Investigator(s)

PI Affiliation
Busara Center for Behavioral Economics

Additional Trial Information

In development
Start date
End date
Secondary IDs
Prior work
This trial does not extend or rely on any prior RCTs.
Inattention (or careless responding) is prevalent in self-administered surveys, threatening data quality and biasing results. The risk of low attention is especially high in remote, self-administered settings, owing to the lack of direct monitoring and supervision by lab staff. As remote and self-administered surveys become the preferred mode of conducting research, the risk of inattention grows accordingly.

In this context, we will test a set of established survey inattention measures (Craig & Meade, 2012; Berinsky et al., 2019), tailored to the Kenyan population but built on a generalizable framework that can be applied in similar contexts. These measures will allow us to detect low survey attention in a wider cross-cultural study while expanding the literature on inattention in remote surveys. The literature suggests that inattentive respondents answer differently from attentive ones on a range of substantive indicators (Bowling et al., 2016); removing inattentive respondents from a dataset on data-quality grounds therefore risks reducing the generalizability of estimates. Faced with this issue, we test two basic interventions to reduce survey inattention:

- A small financial incentive for answering a question in the survey correctly
- The placement of an obvious attention check early in the survey (Hauser & Schwarz, 2015)

Implementing the above provides the opportunity to measure the sensitivity of attention to interventions specifically designed to manipulate it. With this, we will be able to derive recommended survey design features to reduce inattention in mobile surveys.
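As an illustration of how an obvious attention check of this kind is typically scored, the sketch below flags respondents who fail to select an explicitly instructed option. The function name, item wording, and responses are hypothetical; the registry does not specify the actual check items.

```python
def passed_attention_check(response: str, expected: str = "strongly agree") -> bool:
    """Return True if the respondent selected the instructed option.

    An 'obvious' attention check embeds an explicit instruction in the item
    text (e.g. "To show you are paying attention, select 'Strongly agree'"),
    so any other response indicates careless reading.
    """
    return response.strip().lower() == expected

# Hypothetical responses from three participants
responses = ["Strongly agree", "Neutral", " strongly agree "]
flags = [passed_attention_check(r) for r in responses]
print(flags)  # [True, False, True]
```

Counts of failed checks like this one would feed into the attention index listed under Primary Outcomes.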

External Link(s)

Registration Citation

Mughogho, Winnie, Nicholas Owsley and Chang Tang. 2021. "Cross-Cultural Biases and Inattention." AEA RCT Registry. September 03.
Experimental Details


Intervention Start Date
Intervention End Date

Primary Outcomes

Primary Outcomes (end points)
Index of attention
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
We aim to test interventions designed to increase survey attention. To that end, we will randomly assign participants a) to be offered a 'bonus' (a small financial incentive) for completing certain questions correctly, or not, and b) to complete an "instructional manipulation check" (IMC) at either the beginning or the end of a survey on cross-cultural biases. This generates a 2x2 design with four treatment cells, allowing us to identify whether each intervention, or their interaction, affects survey inattention.
Experimental Design Details
Randomization Method
Randomization done in office using statistical software
Randomization Unit
Was the treatment clustered?

Experiment Characteristics

Sample size: planned number of clusters
Sample size: planned number of observations
1,500 individuals from a low-income population; 500 students from universities in Kenya.
Sample size (or number of clusters) by treatment arms
Group 1 (Early IMC, Bonus Incentive); Group 2 (Early IMC, No Bonus Incentive); Group 3 (Late IMC, Bonus Incentive); Group 4 (Late IMC, No Bonus Incentive)
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)

Institutional Review Boards (IRBs)

IRB Name
IRB Approval Date
IRB Approval Number
Analysis Plan

Analysis Plan Documents

Attention Pre-Analysis Plan

MD5: aaba421a08d275cbd44aeb007a013c93

SHA1: 67be61f212b2382548a83d57f798062c1539adc2

Uploaded At: September 02, 2021


Post Trial Information

Study Withdrawal



Is the intervention completed?
Data Collection Complete
Data Publication

Data Publication

Is public data available?

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials