A Large-Scale Resume Audit of the U.S. Labor Market for New College Graduates (2016-2017)

Last registered on January 31, 2024

Pre-Trial

Trial Information

General Information

Title
A Large-Scale Resume Audit of the U.S. Labor Market for New College Graduates (2016-2017)
RCT ID
AEARCTR-0012914
Initial registration date
January 30, 2024

Initial registration date is when the trial was registered.

It corresponds to when the registration was submitted to the Registry to be reviewed for publication.

First published
January 31, 2024, 1:32 PM EST

First published corresponds to when the trial was first made public on the Registry after being reviewed.

Locations

Region

Primary Investigator

Affiliation
Auburn University

Other Primary Investigator(s)

PI Affiliation
University of Wisconsin-La Crosse

Additional Trial Information

Status
Completed
Start date
2016-03-15
End date
2017-07-31
Secondary IDs
Prior work
This trial does not extend or rely on any prior RCTs.
Abstract
Large-scale resume audits were conducted in the United States in 2016 and 2017 to straddle the May graduation dates of our fictive applicants. A total of 36,880 resumes were submitted to job advertisements from a constructed job bank. The job categories included were account executive, banking, customer service, finance, insurance, and marketing. Ads requiring certifications or foreign language skills were not included. By saving the text from the job advertisements, we were able to link the ads to O*NET and the American Community Survey via a machine learning algorithm that matched ads to standard occupation codes.

The study covered the entire US labor market without geographical restrictions. Each job advertisement received four applications, with resumes created using a program by Lahey and Beasley (2009). Names on resumes were used to indicate race/ethnicity and gender.

The design allowed for various combinations of race/ethnicity- and gender-specific names. The study included educational backgrounds and work experiences linked to public flagship universities in the US. Eight majors (economics, finance, marketing, anthropology, philosophy, chemistry, biology, and psychology) and two minors (history and mathematics), all common to the universities in our resumes, were randomly assigned. Other common elements of college graduates' resumes, such as internships, GPAs, volunteer experiences, language skills, and computer proficiency, were also randomized.
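The ad-to-occupation matching mentioned in the abstract is not specified further in this registration. As a purely illustrative sketch, one common approach is to score the saved ad text against short reference descriptions of standard occupation codes using TF-IDF similarity; the SOC codes and reference snippets below are placeholders, not study data.

```python
# Hypothetical sketch: match job-ad text to Standard Occupation Codes (SOC)
# via TF-IDF similarity to labeled reference text. The actual algorithm used
# in the study is not described in this registration; this is illustrative only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Made-up reference snippets keyed by SOC code (assumption, not study data)
soc_reference = {
    "13-2052": "personal financial advisor investments retirement planning clients",
    "41-3021": "insurance sales agent policies premiums claims customers",
    "43-4051": "customer service representative inbound calls account inquiries",
    "11-2021": "marketing manager campaigns brand strategy market research",
}

def match_soc(ad_texts):
    """Return the best-matching SOC code for each job-ad text."""
    codes = list(soc_reference)
    vectorizer = TfidfVectorizer(stop_words="english")
    ref_matrix = vectorizer.fit_transform(soc_reference.values())
    ad_matrix = vectorizer.transform(ad_texts)
    sims = cosine_similarity(ad_matrix, ref_matrix)
    return [codes[row.argmax()] for row in sims]

print(match_soc(["Seeking a customer service rep to handle account inquiries by phone"]))
# -> ['43-4051']
```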
External Link(s)

Registration Citation

Citation
Nunley, John and Alan Seals. 2024. "A Large-Scale Resume Audit of the U.S. Labor Market for New College Graduates (2016-2017)." AEA RCT Registry. January 31. https://doi.org/10.1257/rct.12914-1.0
Experimental Details

Interventions

Intervention(s)
We randomize elements of thousands of work resumes, such as Black- and White-sounding names, to compare the responses of firms to this information.
Intervention Start Date
2016-03-15
Intervention End Date
2017-07-31

Primary Outcomes

Primary Outcomes (end points)
Firm Response, Call Back, Interview Request, Request for other candidate information
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
We conducted identical resume audits in 2016 and 2017. The audits began in March and ended in July each year. Using a popular internet job search board, we submitted 36,880 randomly generated resumes to online ads chosen at random from a bank of jobs constructed by our research team. Rather than submitting resumes to many different types of jobs, the bank of job ads includes those from the following job categories: account executive, banking, customer service, finance, insurance, and marketing. Ads requiring certifications or expertise in a foreign language, as well as those requiring company-specific applications, were excluded from the job bank. Although it is common in the audit literature for researchers to audit select cities/labor markets (Bertrand and Mullainathan, 2004; Lahey, 2008; Nunley et al., 2015; Kroft et al., 2013), we focus on the US labor market and impose no additional location restrictions on which ads enter the job bank. After creating the bank of jobs, our research assistants applied to job openings randomly selected from the job bank using randomly generated resumes. Four resumes were submitted to each ad selected to be audited, and applications were submitted to 9,220 unique advertisements. The result is 36,880 observations (= 9,220 jobs × 4 applications).
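The cluster structure described above can be illustrated with a minimal, hypothetical sketch: ads are drawn at random from the job bank and each selected ad receives four applications, so observations are clustered at the advertisement level. The job-bank contents and helper names below are placeholders, not the study's software.

```python
# Illustrative sketch of the audit structure described above (not the study's code):
# randomly select ads from the job bank and submit four randomly generated
# resumes to each, so observations are clustered at the advertisement level.
import random

random.seed(42)

JOB_CATEGORIES = ["account executive", "banking", "customer service",
                  "finance", "insurance", "marketing"]

# Hypothetical job bank: ad id plus category
job_bank = [{"ad_id": i, "category": random.choice(JOB_CATEGORIES)}
            for i in range(10_000)]

def run_audit(bank, n_ads=9_220, apps_per_ad=4):
    """Sample ads from the bank and attach four applications to each."""
    sampled = random.sample(bank, n_ads)
    observations = []
    for ad in sampled:
        for slot in range(apps_per_ad):
            observations.append({"ad_id": ad["ad_id"],
                                 "category": ad["category"],
                                 "application_slot": slot})
    return observations

obs = run_audit(job_bank)
print(len(obs))  # 36880 = 9220 ads x 4 applications
```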
We use the program developed by Lahey and Beasley (2009) to randomly assign a name, address, university, major, and work experience obtained during college to the resumes. In line with the audit literature (e.g., see Bertrand and Mullainathan, 2004), we use names that are distinct along racial/ethnic and gender lines to signal race/ethnicity and gender to prospective employers, with the goal of detecting both the existence and the extent of discrimination. The names used in our study are presented in Table 1. Our design does not restrict the combinations of racial/ethnic- and sex-specific names that could be submitted to a given job ad. For example, consider applications submitted by fictive applicants with distinctively Black names. Of the 9,220 job openings to which we applied, 15 percent did not receive a submission from a Black male or a Black female; 23 percent received an application from one Black female and zero from Black males, and vice versa (also around 23 percent); 22 percent received applications from one Black male and one Black female; around 5 percent received two applications from Black females and zero applications from Black males, and vice versa (also around 5 percent); 3 percent received three distinctively Black applications (two males and one female, or two females and one male); and less than 1 percent received four distinctively Black applications (two males and two females). The analogous percentages for the distinctively White and Hispanic names are similar to those for the Black names.

The addresses assigned to fictive applicants are tied to the university to which they are assigned. The addresses were chosen such that the fictive applicant lived within a few miles of the university from which they will receive or have received (depending on the month of application) a Bachelor's degree. Per our IRB agreement, we are unable to provide the names of the universities, but we note that each is a public, flagship university, and the chosen universities span the continental US. Using Census region groupings, three of the universities are located in the Southeast, two in the Southwest, two in the West, three in the Midwest, and two in the Northeast.

Eight majors are incorporated into the design and assigned to fictive applicants with equal probability (12.5 percent): economics, finance, marketing, anthropology, philosophy, chemistry, biology, and psychology. The work experience accumulated during college that is assigned to the applicants includes three different types, each assigned with equal probability (i.e., 1/3): retail/sales, restaurant/coffee shop, and university employment (e.g., library, campus recreation, dining services). We are unable to provide the job titles or firm names associated with the applicants' college work experience due to our IRB agreement.

Portions of the fictive applicants are assigned minors, internship experience, grade point averages (GPAs), volunteer experience, fluency/proficiency in speaking Spanish, receipt of study-abroad scholarships, and different sets of computer skills (e.g., programming and data analysis). In terms of minors, two are included in the design: history and mathematics. These minors are available to students at each of the universities used in our experiment. The internships assigned to the fictive applicants' resumes are grouped into two categories: analytical and social. For example, we include a number of internships with the titles "Marketing Analyst Intern", "Financial Analyst Intern", and "Research Intern" in the "analytical" category, and "Marketing Sales", "Financial Sales", and "General Sales" in the "social" grouping.

Twenty-five percent of applicants report no information regarding their GPA. The remaining applicants report GPAs of 3.0, 3.2, 3.4, 3.6, 3.8, or 4.0, with each of the six values assigned with equal probability (12.5 percent). The volunteer experiences randomly assigned to fictive applicants vary across three different types. We are unable to reveal the types of volunteer experiences due to the prospect of violating our IRB agreement; even revealing the type of charitable work performed by the organizations would likely reveal their identities. Twenty-five percent of the fictive applicants were assigned a study-abroad experience, and these experiences vary across seven locations: Argentina, China, Dubai, Italy, Japan, Mexico, and South Africa. Conditional on being assigned a study-abroad experience, the location is randomly assigned with equal probability. In terms of computer skills, applicants report the following on their resumes: no computer-related skills or information (25 percent), basic skills (25 percent), data analysis (25 percent), programming (12.5 percent), or the combination of data analysis and programming (12.5 percent).
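As a minimal illustration of the attribute randomization described above, the sketch below draws one applicant's attributes using the stated marginal probabilities, assuming independent draws (the registration does not specify the exact implementation, and the labels are placeholders).

```python
# Illustrative sketch of the attribute randomization described above, assuming
# independent draws with the stated marginal probabilities (the registration
# does not specify the exact implementation; labels here are placeholders).
import random

MAJORS = ["economics", "finance", "marketing", "anthropology",
          "philosophy", "chemistry", "biology", "psychology"]          # 1/8 each
WORK_EXPERIENCE = ["retail/sales", "restaurant/coffee shop",
                   "university employment"]                            # 1/3 each
GPAS = [None, 3.0, 3.2, 3.4, 3.6, 3.8, 4.0]
GPA_WEIGHTS = [0.25] + [0.125] * 6                                     # 25% report no GPA
STUDY_ABROAD = ["Argentina", "China", "Dubai", "Italy",
                "Japan", "Mexico", "South Africa"]
COMPUTER_SKILLS = ["none", "basic", "data analysis",
                   "programming", "data analysis + programming"]
COMPUTER_WEIGHTS = [0.25, 0.25, 0.25, 0.125, 0.125]

def draw_resume_attributes(rng=random):
    """Draw one fictive applicant's randomized resume attributes."""
    abroad = rng.random() < 0.25                                       # 25% study abroad
    return {
        "major": rng.choice(MAJORS),
        "work_experience": rng.choice(WORK_EXPERIENCE),
        "gpa": rng.choices(GPAS, weights=GPA_WEIGHTS, k=1)[0],
        "study_abroad": rng.choice(STUDY_ABROAD) if abroad else None,
        "computer_skills": rng.choices(COMPUTER_SKILLS,
                                       weights=COMPUTER_WEIGHTS, k=1)[0],
    }

print(draw_resume_attributes())
```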
Experimental Design Details
Randomization Method
Advertisements from a prominent job board were randomly selected, and each selected ad was assigned four randomly generated resumes.
Randomization Unit
Individual Resumes
Was the treatment clustered?
Yes

Experiment Characteristics

Sample size: planned number of clusters
10,000 job advertisements (9,220 job advertisements were applied to)
Sample size: planned number of observations
40,000 resumes (36,880 resumes were used)
Sample size (or number of clusters) by treatment arms
36,880
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
We did not conduct power calculations for this experiment, as the number of observations and clusters would exceed the traditional minimums.
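For reference only, the sketch below shows how a minimum detectable effect for a two-group callback comparison could be approximated with a standard normal-approximation formula and a cluster design effect; the baseline callback rate and intracluster correlation are assumptions, not study values, and this calculation was not part of the registered design.

```python
# Illustrative MDE approximation for a two-group callback comparison, adjusted
# for clustering via a design effect. Baseline rate and ICC are assumptions,
# not study values; this calculation was not part of the registered design.
from scipy.stats import norm

def mde_two_proportions(p0, n_per_arm, cluster_size, icc,
                        alpha=0.05, power=0.80):
    """Approximate MDE (difference in callback proportions) for equal-sized arms."""
    deff = 1 + (cluster_size - 1) * icc          # design effect for clustering
    n_eff = n_per_arm / deff                     # effective sample size per arm
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    se = (2 * p0 * (1 - p0) / n_eff) ** 0.5      # pooled-variance approximation
    return z * se

# Assumed inputs: 10% baseline callback rate, ICC of 0.05, half of the
# 36,880 observations in each hypothetical comparison group.
print(round(mde_two_proportions(0.10, 18_440, 4, 0.05), 4))  # ~0.0094, i.e., ~0.9 p.p.
```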
IRB

Institutional Review Boards (IRBs)

IRB Name
Auburn University Human Research Protection Program: Institutional Review Board for the Protection of Human Subjects in Research (IRB)
IRB Approval Date
2016-05-20
IRB Approval Number
16-115 D: Do Not Constitute Human Subjects Research

Post-Trial

Post Trial Information

Study Withdrawal


Intervention

Is the intervention completed?
Yes
Intervention Completion Date
July 31, 2017, 12:00 +00:00
Data Collection Complete
Yes
Data Collection Completion Date
July 31, 2017, 12:00 +00:00
Final Sample Size: Number of Clusters (Unit of Randomization)
9,220 job advertisements
Was attrition correlated with treatment status?
No
Final Sample Size: Total Number of Observations
36,880 observations (= 9,220 job advertisements × 4 applications per advertisement)
Final Sample Size (or Number of Clusters) by Treatment Arms
Data Publication

Data Publication

Is public data available?
No


Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials