Employment Discrimination against Indigenous Peoples in the United States: Evidence from a Field Experiment

Last registered on August 08, 2019

Pre-Trial

Trial Information

General Information

Title
Employment Discrimination against Indigenous Peoples in the United States: Evidence from a Field Experiment
RCT ID
AEARCTR-0002299
Initial registration date
July 02, 2017


First published
July 02, 2017, 11:04 PM EDT


Last updated
August 08, 2019, 6:37 PM EDT


Locations

Primary Investigator

Affiliation
Tulane University

Other Primary Investigator(s)

PI Affiliation
Tulane University

Additional Trial Information

Status
Completed
Start date
2017-03-02
End date
2018-06-01
Secondary IDs
Abstract
This pre-analysis plan and methodology draft describes how we will design and implement a field experiment – a resume correspondence study – to measure discrimination in hiring faced by Indigenous Peoples in the United States (American Indians, Alaska Natives, and Native Hawaiians). We create realistic resumes of men and women applying for common entry-level jobs (retail sales, cook, server, janitor, security). We send employers resumes that signal that the applicant is either Indigenous or white, with all other resume features being the same on average. We further signal that some of the Native American applicants grew up on an Indian reservation to measure the extent to which employers penalize Native Americans who grew up on Indian reservations. After discussing how we created the resumes and applied for jobs, we discuss ways that we have pre-specified our analysis to prevent concerns such as data mining.
External Link(s)

Registration Citation

Citation
Button, Patrick and Brigham Walker. 2019. "Employment Discrimination against Indigenous Peoples in the United States: Evidence from a Field Experiment." AEA RCT Registry. August 08. https://doi.org/10.1257/rct.2299-4.0
Former Citation
Button, Patrick and Brigham Walker. 2019. "Employment Discrimination against Indigenous Peoples in the United States: Evidence from a Field Experiment." AEA RCT Registry. August 08. https://www.socialscienceregistry.org/trials/2299/history/51453
Experimental Details

Interventions

Intervention(s)
See the pre-analysis plan for details. To summarize, we sent employers who post entry-level job advertisements two resumes, one for a white applicant and one for an Indigenous applicant. We track which resumes get "callbacks" (interview requests or other positive responses) by race. We do not otherwise intervene or contact employers beyond emailing these resumes.
Intervention Start Date
2017-03-02
Intervention End Date
2017-12-18

Primary Outcomes

Primary Outcomes (end points)
Callbacks
Primary Outcomes (explanation)
We define this in the pre-analysis plan. To summarize, our default measure of callbacks includes explicit interview requests and other positive responses, as well as ambiguous responses (e.g., "Please send us more information").
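As an illustration of that default measure, the sketch below codes employer responses into callbacks. The category labels here are hypothetical, not the study's actual coding scheme:

```python
# Illustrative coding of employer responses into the default callback
# measure: explicit positives always count; ambiguous responses (such as
# "please send us more information") count under the default definition.
POSITIVE = {"interview_request", "positive_other"}
AMBIGUOUS = {"request_more_info"}

def is_callback(response_type, include_ambiguous=True):
    """Return True if a response counts as a callback.

    With include_ambiguous=True (the default measure), ambiguous
    responses also count; set it to False for a stricter measure.
    """
    if response_type in POSITIVE:
        return True
    return include_ambiguous and response_type in AMBIGUOUS
```

Keeping the ambiguous category behind a flag makes it easy to report both the default and a stricter callback rate from the same coded data.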

Secondary Outcomes

Secondary Outcomes (end points)
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
We discuss this in depth in the pre-analysis plan; there are many details that are difficult to summarize here. The main points: we send each job opening two resumes, one white and one Indigenous, in random order with an eight-hour delay between the two. All aspects of the resumes are the same on average except for the signal of Indigenous status (or no signal).
Experimental Design Details
Randomization Method
Visual Basic for Applications code was used to randomly generate the resumes and determine the order in which the resumes are sent out.
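The study implemented this randomization in Visual Basic for Applications; the Python sketch below illustrates the same logic under stated assumptions. The template names and data structure are hypothetical, not taken from the study's code:

```python
import random

def assign_resumes(job_ids, seed=0):
    """For each job opening, randomly attach the Indigenous signal to one
    of two otherwise-equivalent resume templates, then randomize which
    resume is emailed first (the second follows after a delay)."""
    rng = random.Random(seed)  # fixed seed makes the plan reproducible
    plan = {}
    for job in job_ids:
        templates = ["template_A", "template_B"]
        rng.shuffle(templates)
        indigenous_tmpl, white_tmpl = templates
        order = [("indigenous", indigenous_tmpl), ("white", white_tmpl)]
        rng.shuffle(order)  # randomize send order within the opening
        plan[job] = order
    return plan
```

Randomizing both the template assignment and the send order within each opening keeps resume content and application timing balanced across the two race signals, matching the job-opening-level randomization described above.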
Randomization Unit
Randomization occurs at the job opening level.
Was the treatment clustered?
No

Experiment Characteristics

Sample size: planned number of clusters
See the pre-analysis plan for a complete discussion, with footnotes. Our planned number of clusters (job openings) is at least 4,211.

A key aspect of this plan was to conduct a power analysis, based on previous studies, to determine how many observations would be necessary to detect meaningful differences in callback rates between major resume types. Based on previous studies, we judged differences of about three percentage points in the interview request rate to be plausible, and we would like to be able to detect a difference of at least this magnitude between white and Indigenous applicants. Based on our calculations, we anticipated needing to apply to 4,211 jobs (8,422 applications) to detect differences in callback rates between white and Indigenous applicants of at least three percentage points.

Of course, we would ideally like to collect more data than this to achieve higher power, detect differences smaller than three percentage points, or detect other mediators of discrimination (e.g., reservation upbringing, city demographics, gender, occupation) with enough precision. Our funding sources are also variable, such that we may receive enough funding to go beyond this number. We have therefore decided that, for the main comparison of white versus Indigenous, we commit to conducting an analysis with the first 8,422 observations in addition to the full number of observations we ultimately obtain, if we exceed this amount, and to discussing any differences between these two sets of results. This strikes a balance between the risk that a researcher could collect data until estimates are significant and the more salient cost of tossing out useful data that could increase precision simply to avoid this risk.
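A calculation of this kind can be sketched with the standard two-sample power formula based on Cohen's arcsine effect size h. The baseline callback rate (30%), two-sided 5% test, and 80% power below are illustrative assumptions, not necessarily those of the registered power analysis:

```python
import math
from statistics import NormalDist

def mde_two_proportions(n_per_arm, p_base, alpha=0.05, power=0.80):
    """Minimum detectable difference in callback rates for a two-sample
    comparison of proportions, via Cohen's arcsine effect size h."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided critical value
    z_beta = NormalDist().inv_cdf(power)           # power quantile
    # smallest detectable h with equal arms of size n_per_arm
    h = (z_alpha + z_beta) * math.sqrt(2.0 / n_per_arm)
    # invert h = 2*asin(sqrt(p2)) - 2*asin(sqrt(p1)) for p2
    phi = 2 * math.asin(math.sqrt(p_base)) + h
    return math.sin(phi / 2) ** 2 - p_base

# With 4,211 openings per arm and a 30% baseline callback rate, the MDE
# is roughly 2.8 percentage points, in line with the three-point target.
mde = mde_two_proportions(4211, 0.30)
```

Under these assumptions the sketch is broadly consistent with the registered figure; the study's own calculation may differ in its assumed baseline rate, power, or treatment of the paired design.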
Sample size: planned number of observations
Our planned number of observations is twice the number of clusters, since we send two resumes per cluster (job opening).
Sample size (or number of clusters) by treatment arms
We have other, more minor "treatments," such as signaling that some Native American applicants are from Indian reservations. These are hard to summarize here, but all are discussed in depth in the pre-analysis plan.
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
The minimum detectable effect size is a three percentage point difference in callback rate, assuming the sample size of 4,211 jobs and under the assumptions made in the power analysis.
IRB

Institutional Review Boards (IRBs)

IRB Name
Tulane University Institutional Review Board
IRB Approval Date
2017-11-21
IRB Approval Number
1085352
IRB Name
Tulane University Institutional Review Board
IRB Approval Date
2016-06-06
IRB Approval Number
16-910329UE
Analysis Plan

Analysis Plan Documents

Pre-Analysis+Plan+-+Indidenous+Discrimination+Study.docx

MD5: a74cd029b3741945667e738f3b89061b

SHA1: beedaa57226f8fa9bfe279efd21c7c284f2c9a1b

Uploaded At: August 08, 2019

Pre-Analysis Plan and Methodology for: “Employment Discrimination against Indigenous Peoples in the United States: Evidence from a Field Experiment”

MD5: dc3d3d1e7dbd69051e5295d13282c8fa

SHA1: c31f3927f6bb1a70aa649cabf6eb5e3d96629331

Uploaded At: July 02, 2017

Post-Trial

Post Trial Information

Study Withdrawal

There is information in this trial unavailable to the public.

Intervention

Is the intervention completed?
Yes
Intervention Completion Date
December 15, 2017, 12:00 +00:00
Data Collection Complete
Yes
Data Collection Completion Date
December 15, 2017, 12:00 +00:00
Final Sample Size: Number of Clusters (Unit of Randomization)
6,758 job advertisements
Was attrition correlated with treatment status?
No
Final Sample Size: Total Number of Observations
13,516 job applications
Final Sample Size (or Number of Clusters) by Treatment Arms
6,758 Indigenous applicants ("treatment"), 6,758 non-Indigenous applicants ("control")
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials