Employment Discrimination against Indigenous Peoples in the United States: Evidence from a Field Experiment
Last registered on February 06, 2018


Trial Information
General Information
Employment Discrimination against Indigenous Peoples in the United States: Evidence from a Field Experiment
Initial registration date
July 02, 2017
Last updated
February 06, 2018 11:15 PM EST
Primary Investigator
Tulane University
Other Primary Investigator(s)
PI Affiliation
Tulane University
Additional Trial Information
Ongoing
Start date
End date
Secondary IDs
This pre-analysis plan and methodology draft describes how we will design and implement a field experiment, a resume correspondence study, to measure the discrimination in hiring faced by Indigenous Peoples in the United States (American Indians, Alaska Natives, and Native Hawaiians). We create realistic resumes of men and women applying for common entry-level jobs (retail sales, cook, server, janitor, security). We send employers resumes that signal that the applicant is either Indigenous or white, with all other resume features being the same on average. We further signal that some of the Native American applicants grew up on an Indian reservation to measure the extent to which employers penalize Native Americans who grew up on Indian reservations. After discussing how we created the resumes and applied for jobs, we discuss how we have pre-specified our analysis to prevent concerns such as data mining.
External Link(s)
Registration Citation
Button, Patrick and Brigham Walker. 2018. "Employment Discrimination against Indigenous Peoples in the United States: Evidence from a Field Experiment." AEA RCT Registry. February 06. https://www.socialscienceregistry.org/trials/2299/history/25604
Experimental Details
See the pre-analysis plan for details. To summarize, we sent employers who post entry-level job advertisements two resumes, one for a white applicant and one for an Indigenous applicant. We track which resumes get "callbacks" (interview requests or other positive responses) by race. We do not otherwise intervene or contact employers beyond emailing these resumes.
Intervention Start Date
Intervention End Date
Primary Outcomes
Primary Outcomes (end points)
Primary Outcomes (explanation)
We define this in the pre-analysis plan. To summarize, our default measure of callbacks includes explicit interview requests and other positive responses, as well as ambiguous responses (e.g., "Please send us more information.").
Secondary Outcomes
Secondary Outcomes (end points)
Secondary Outcomes (explanation)
Experimental Design
Experimental Design
We discuss this in depth in the pre-analysis plan. There are many details that are difficult to summarize here, but to note the main points: we send each job opening two resumes, one white and one Indigenous, in random order with an eight-hour delay between resumes. All aspects of the resumes are the same on average except for the signal of Indigenous status (or no signal).
Experimental Design Details
Randomization Method
Visual Basic for Applications code was used to randomly generate the resumes and determine the order that the resumes are sent out.
Randomization Unit
Randomization occurs at the job opening level.
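The plan states that Visual Basic for Applications code generated the resumes and determined send order. As a minimal illustration of the job-level randomization logic only, a Python sketch (the function name and seed handling are hypothetical, not from the plan):

```python
import random

def assign_order(job_id, seed=None):
    """Randomly order the white and Indigenous resumes for one job opening.

    Randomization occurs at the job-opening level: each opening receives
    both resume types, in a randomly determined order. The first resume
    is sent immediately; the second follows after the eight-hour delay
    described in the experimental design.
    """
    rng = random.Random(seed)
    resumes = ["white", "indigenous"]
    rng.shuffle(resumes)
    return resumes

# Example: determine the send order for a single job opening.
order = assign_order(job_id=1, seed=42)
```

Because both resume types go to every opening, this randomization affects only the order of arrival, which guards against order effects in employer responses.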
Was the treatment clustered?
Experiment Characteristics
Sample size: planned number of clusters
See the pre-analysis plan for a complete discussion, with footnotes. Our planned number of clusters (job openings) is at least 4,211.

A key aspect of this plan was to conduct a power analysis, based on previous studies, to determine how many observations would be necessary to detect meaningful differences in callback rates between major resume types. Previous studies suggested that differences of about three percentage points in the interview request rate were plausible, and we want to be able to detect a difference of at least this magnitude between white and Indigenous applicants. Based on our calculations, we anticipated needing to apply to 4,211 jobs (8,422 applications) to detect differences in callback rates between white and Indigenous applicants of at least three percentage points.
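A minimum detectable effect of this kind can be sketched with a standard two-proportion normal approximation. The sketch below is illustrative, not the plan's actual calculation: the 30% baseline callback rate is an assumption (not stated in this summary), and the simple two-sample formula ignores any within-job correlation from the paired design.

```python
from scipy.stats import norm

def mde_two_proportions(n_per_group, p_base, alpha=0.05, power=0.80):
    """Minimum detectable difference in callback rates between two groups,
    using the normal approximation for a two-sided two-proportion test."""
    z_alpha = norm.ppf(1 - alpha / 2)  # critical value for two-sided test
    z_beta = norm.ppf(power)           # value for the desired power
    # Standard error of the difference, assuming both groups sit near p_base
    se = (2 * p_base * (1 - p_base) / n_per_group) ** 0.5
    return (z_alpha + z_beta) * se

# With 4,211 applications per group and an assumed 30% baseline callback
# rate, the detectable difference comes out near three percentage points.
mde = mde_two_proportions(n_per_group=4211, p_base=0.30)
print(f"MDE: {mde:.3f}")
```

Under these assumptions the computed MDE is roughly 0.028, consistent with the plan's stated target of a three-percentage-point difference at the planned sample size.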

Of course, we would ideally like to collect more data than this to achieve higher power, detect differences smaller than three percentage points, or estimate other mediators of discrimination (e.g., reservation upbringing, city demographics, gender, occupation) with enough precision. Our funding sources are also variable, such that we may get enough funding to go beyond this number. For the main comparison of white versus Indigenous applicants, we commit to conducting an analysis with the first 8,422 observations in addition to an analysis with the full number of observations we ultimately obtain, if we exceed this amount, and to discussing any differences between these two sets of results. This strikes a balance between the risk that a researcher could collect data until estimates are significant and the more salient cost of discarding useful data that could increase precision simply to avoid this risk.
Sample size: planned number of observations
Our planned number of observations is twice the number of clusters, since we send two resumes per cluster (job opening).
Sample size (or number of clusters) by treatment arms
We also have other, more minor "treatments," such as signaling that some Native American applicants grew up on Indian reservations. These are hard to summarize here, but all are discussed in depth in the pre-analysis plan.
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
The minimum detectable effect size is a three-percentage-point difference in callback rates, assuming a sample size of 4,211 jobs and the assumptions made in the power analysis.
IRB Name
Tulane University Institutional Review Board
IRB Approval Date
IRB Approval Number
Analysis Plan
Analysis Plan Documents
Pre-Analysis Plan and Methodology for: “Employment Discrimination against Indigenous Peoples in the United States: Evidence from a Field Experiment”

MD5: dc3d3d1e7dbd69051e5295d13282c8fa

SHA1: c31f3927f6bb1a70aa649cabf6eb5e3d96629331

Uploaded At: July 02, 2017

Post Trial Information
Study Withdrawal
Is the intervention completed?
Is data collection complete?
Data Publication
Data Publication
Is public data available?
Program Files
Program Files
Reports and Papers
Preliminary Reports
Relevant Papers