An Audit Study of Ban the Box Legislation
Last registered on June 15, 2016

Pre-Trial

Trial Information
General Information
Title
An Audit Study of Ban the Box Legislation
RCT ID
AEARCTR-0000675
Initial registration date
April 16, 2015
Last updated
June 15, 2016 9:46 AM EDT
Location(s)
Region
Primary Investigator
Affiliation
Rutgers University
Other Primary Investigator(s)
PI Affiliation
University of Michigan
Additional Trial Information
Status
Completed
Start date
2015-01-31
End date
2016-06-14
Secondary IDs
Abstract
This study investigates the labor market effects of Ban-the-Box laws, which restrict employers' access to and use of information about job applicants' criminal records.
External Link(s)
Registration Citation
Citation
Agan, Amanda and Sonja Starr. 2016. "An Audit Study of Ban the Box Legislation." AEA RCT Registry. June 15. https://doi.org/10.1257/rct.675-3.0.
Former Citation
Agan, Amanda and Sonja Starr. 2016. "An Audit Study of Ban the Box Legislation." AEA RCT Registry. June 15. https://www.socialscienceregistry.org/trials/675/history/8853.
Experimental Details
Interventions
Intervention(s)
Intervention Start Date
2015-01-31
Intervention End Date
2015-03-31
Primary Outcomes
Primary Outcomes (end points)
Whether the applicant receives a call or email from the potential employer. These responses will be further classified as "positive response" (any call or email suggesting potential interest) or "interview" (specifically requests interview). The main outcome of interest will be "positive responses."
Primary Outcomes (explanation)
Secondary Outcomes
Secondary Outcomes (end points)
Secondary Outcomes (explanation)
Experimental Design
Experimental Design
Will be made public when the trial is complete
Experimental Design Details
This study investigates the labor market effects of Ban the Box legislation, which prohibits employers from asking about criminal records on job applications. The design is a correspondence audit study focusing on entry-level employers in New Jersey, which adopted a Ban-the-Box law effective March 1, 2015. Fictitious online job applications will be sent to these entry-level employers both before the law goes into effect (the pre-period) and after the law goes into effect (the post-period).

We are investigating several questions:

(1) What is the effect of stating (on an online job application) that one has a criminal record on the probability that employers will give an applicant a positive response? This question will be principally tested using our pre-period data. (After Ban-the-Box goes into effect, most employers will presumably remove the criminal records question. Some employers do not have the question even in the pre-period.)

(2) Does this effect vary depending on the applicant's other characteristics? The primary interaction effect of interest is race x record. We will also test interactions with characteristics that may signify incarceration: GED (vs. high school diploma) and employment gaps, and we will test whether race interacts with these other characteristics.

(3) What is the main effect of other applicant characteristics on positive response probability? Again, we will assess race (the effect of primary interest), education type, and employment gap effects, using both pre- and post-period data.

(4) Do employers in fact comply with Ban-the-Box by removing the criminal records question? (This is a purely observational question; it is not subject to experimental manipulation.)

(5) After Ban-the-Box goes into effect, do employers (particularly those that asked about criminal records in the pre-period) become more likely to discriminate based on characteristics that, in the real world, are correlated with criminal records, in particular race, GED (vs. high school diploma), and employment gap?

(6) Before Ban-the-Box, is there cross-sectional variation in the effect of these characteristics (race, GED, work gap) between businesses that ask about criminal records on the application and those that do not?

Questions (5) and (6) speak to the same basic question: whether limited access to criminal record information makes employers more likely to statistically discriminate against groups they perceive as likely to have records. To test that question, we will look both at the two-way interactions described in (5) and (6) and, assuming we are able to build a sample that provides sufficient statistical power, at three-way interactions that exploit both cross-sectional and temporal variation (e.g., race*period*box).

Our focus is New Jersey, but other states may be added if they adopt Ban-the-Box while we are working on the study. We will send (fictitious) online job applications to businesses that have posted availability for jobs suitable for candidates with limited work experience, no post-secondary education, and no specialized skills. If there are open jobs in both the pre- and post-period, each business will receive four applications total, one pair in each period.

The applications will be similar on all but our randomly assigned treatment dimensions:

(1) Has a felony criminal conviction or not
(1a) Conditional on being convicted of a crime: drug conviction or theft conviction
(2) Race: black or white
(3) Has a 1-year employment gap or not
(4) GED or high school diploma

Characteristics will be independently randomly assigned to each job applicant with equal (50%) probability.
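As a concrete illustration, the independent 50/50 assignment of treatment dimensions described above can be sketched as follows. This is hypothetical code: the study's actual profiles were generated with the Resume Randomizer program of Lahey and Beasley (2009), and the function and field names here are illustrative only.

```python
import random

# Illustrative sketch (not the authors' code) of drawing the four
# experimentally manipulated dimensions, each independent with 50% probability.
def draw_treatment_dims(rng):
    dims = {
        "felony_conviction": rng.random() < 0.5,          # dimension (1)
        "race": rng.choice(["black", "white"]),           # dimension (2)
        "employment_gap": rng.random() < 0.5,             # dimension (3)
        "credential": rng.choice(["GED", "HS diploma"]),  # dimension (4)
    }
    # Dimension (1a): conviction type is drawn only for convicted applicants.
    if dims["felony_conviction"]:
        dims["conviction_type"] = rng.choice(["drug", "theft"])
    return dims

rng = random.Random(42)
profile = draw_treatment_dims(rng)
```

Because each dimension is drawn independently, every cell of the race x record x gap x credential design is populated in expectation at equal rates.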
Race is indicated via the name of the applicant, with distinctly black and distinctly white names chosen based on a study of birth certificate records from New Jersey. All other job applicant characteristics are assigned to be as similar as possible, but randomly varied slightly so as to disguise the similarity of the applications. These include:

(1) Work history: All applicants have approximately 3.5 years of work experience: approximately 2 years as a crew member at a fast food chain or convenience store and approximately 1.5 years in a manual labor job such as home improvement, landscaping, or moving. All applicants are unemployed at the time of the job application, having ended their most recent job 2-3 months before the application is sent. Descriptions of job duties and reasons for leaving jobs were slightly varied.

(2) Address: Because employers are likely to be concerned about employees being able to get to work, we want applicants to live near the jobs to which they apply. To achieve this, we chose the 40 cities/towns in NJ that are nearest to the most open jobs on snagajob.com, using an optimization tool included in ArcGIS software. These 40 towns/cities serve as "centers" where our applicants have addresses and local phone numbers. Within these centers, 4 addresses are chosen from neighborhoods that are at least 10% black, at least 20% white, and do not have a median income above $100,000. Addresses are then slightly perturbed so as not to represent real addresses, and are randomly assigned to applicants.

(3) High school (if diploma earner): High schools were chosen to be in cities that are at least 10% black, at least 20% white, have at least 25,000 people, and do not have a median income above $100,000. In addition, the high schools do not have test scores above the 90th or below the 10th percentile. Applicants in a center were then randomly assigned a high school from this set, restricted to schools at least 30 miles from the center (but within New Jersey), to reduce the probability that the high school sends any unobservable signals to the employer. Applicants with GEDs were randomly assigned descriptions/names of GED training programs.

(4) References: Two fictitious references with phone numbers were created, representing the applicant's supervisors at each previous job.

(5) Phone number: Each applicant is assigned a phone number (purchased from callfire.com) based on their center, race, criminal history, and time period (that is, each center has 4 potential phone numbers during each time period); as a result of this division, no business will ever receive two applications using the same phone number. The wording and voice of the outgoing voicemail message were randomized.

(6) Email address: A unique email address is created for each applicant, with the format randomly varied.

We then randomly create applicant profiles for each center via the Resume Randomizer program created by Lahey and Beasley (2009). An applicant profile consists of: a name, a phone number (assigned at the center x race x crime level), an address, an employment history, a unique email address, 2 references with phone numbers, information on high school diploma or GED receipt, a criminal history status and information about the criminal charge, and a formatted resume (for use if the application requires a resume only).

Finding jobs: We use two strategies to identify job openings for which our applicants appear qualified: (1) snagajob.com, a large online job board dedicated to hourly employment; and (2) job applications sent directly to openings listed on the websites of chain businesses with at least 30 locations and 300 employees in NJ, per the BusinessUSA database.
Industries include restaurants, convenience stores, department stores, home centers, grocery stores, pharmacies, miscellaneous retail, service stations, and hotels/motels.

Applying for the job: Each RA is randomly assigned a "center" in which to search for jobs via the above-described methods. Once a job is identified, a profile is randomly chosen from that center to apply for the job. The RA then uses the information contained in the profile to fill out the job application. For each job vacancy, up to two applications will be sent in each time period. When two applications are sent, we use the Resume Randomizer program to ensure that the second application to the same business does not use the same race (and thus name and phone number), address, or criminal conviction type. The time between the first and second applications will vary.

It is possible that multiple locations of the same chain business use the same hiring managers. To avoid letting these managers view extremely similar applicants from the same towns, which could arouse suspicion, whenever an RA applies to the same company within a geographic center, he/she will always use the same pair of applicant profiles. This mirrors real-world applicant behavior (applying to multiple nearby locations at once is common). Applications to locations in different centers, in contrast, will differ: applications from different centers are less similar to one another (different cities of residence, most recent employers, high schools, etc.), so even if a manager covers locations in multiple centers, this is less likely to trigger suspicion.
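The constraint that a second application to the same business differ on race, address, and conviction type can be sketched as follows. This is hypothetical code: in the study the constraint was enforced via the Resume Randomizer program, and the field names here are illustrative.

```python
import random

# Hypothetical sketch (not the authors' code) of the second-application
# constraint: the second profile must differ on race and address, and on
# conviction type whenever both applicants have convictions.
def pick_second(first, profiles, rng):
    def differs(p):
        if p["race"] == first["race"] or p["address"] == first["address"]:
            return False
        if ("conviction_type" in p and "conviction_type" in first
                and p["conviction_type"] == first["conviction_type"]):
            return False
        return True
    candidates = [p for p in profiles if differs(p)]
    return rng.choice(candidates) if candidates else None
```

Returning None when no candidate satisfies the constraint mirrors the practical limit: a second application is only sent when a sufficiently different profile exists in the center's pool.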
Application record-keeping: While filing the job application, the RA also fills out a spreadsheet that records, among other things, which profile was sent to the job, the date and time of the application, the name of the company being applied to, the name of the position, the address/location of the job, when the job was posted (if available), and whether the application asked about criminal history.

Outcome data collection: We will record whether the applicant receives a call or email from the employer. Responses will be further classified as "positive response" (any contact suggesting interest) or "interview" (specifically requests an interview). The main outcome of interest will be "positive response." Data will be collected for 8 weeks from the application date.

Note that we are registering this study after the start of data collection.
Randomization Method
Randomization performed via computer
Randomization Unit
Individual applicant level. Note that some individual applicants' applications were sent to multiple businesses (within the same chain and geographic center), as described above, so the number of unique applicants is lower than the number of applications.
Was the treatment clustered?
No
Experiment Characteristics
Sample size: planned number of clusters
Approximately 2000 unique applicants.

Note: While randomization was at the applicant level, standard errors may be clustered at a different level, e.g.:
--approximately 1500 businesses, each of which receives up to 4 applications
--approximately 500 clusters defined by corporate chain x geographic region (roughly 125 corporate chains across 40 geographic regions), grouping nearby locations of the same chains, to which we are sending the same sets of applications
Sample size: planned number of observations
Approximately 4,000 to 8,000 applications (best estimate is 6,000)
Sample size (or number of clusters) by treatment arms
For each experimentally manipulated attribute, 50% of the sample will have one attribute and 50% the other. This will lead to:

Approximately 1,000 to 2,000 black pre-period
Approximately 1,000 to 2,000 black post-period
Approximately 1,000 to 2,000 white pre-period
Approximately 1,000 to 2,000 white post-period

The same division applies to the other characteristics of interest (GED/HS diploma, employment gap/no employment gap, criminal record/no criminal record), each of which will also be split 50/50 in both the pre- and post-periods. The criminal record arm will be further split into drug crimes and property crimes.

Note that we are interested in the interactions of these characteristics with whether (in the pre-period) the employer asks about criminal records on the initial employment application. This feature is determined by the research subjects, however, and is not experimentally manipulated.
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
Of course, final power calculations will depend on the exact feasible sample size (which will be based on how many eligible job openings we can identify and constrained by timing considerations due to the law change) and on the baseline probability of a call-back in our sample. In previous audit studies, positive response call-back rates have varied widely. Some examples include:

Lahey (2008): 8-10%
Oreopoulos (2011): 11-16%
Deming et al. (2014): 8.2%
Bertrand and Mullainathan (2004): 8%
Phillips (2015): 18.7%
Pager (2003): ~17% (in-person applications)
Pager et al. (2009): 23% (in-person applications)

The first four audit studies use resumes and mostly college-level jobs. The final three are more focused on the low-wage, low-skilled sector: Phillips uses both resumes and online applications, while Pager and Pager et al. use in-person job applications.

In a linear model, if we assume a call-back probability of 15% (conservative based on the studies most closely related to ours), a standard deviation of 0.125 (.15*(1-.15)), power = .8, and alpha = 0.05, then our minimum detectable effect size for the main effects of our manipulated characteristics (e.g., race) is 0.9 percentage points, or 6 percent. Because our outcome variable is binary, however, using the same call-back probability of 15% and sample size of 6,000, making some simplifying assumptions, ignoring clustering, and using the calculation described in Demidenko (2007), we should have the power to detect a main effect (e.g., of race) with an odds ratio of 1.217 in a simple bivariate logit regression. See http://www.dartmouth.edu/~eugened/power-samplesize.php.

In analyses that include interaction effects, power will be reduced. On the other hand, we also intend to conduct "within-subjects" analyses for the businesses to which we are able to send applications in both periods; these analyses should be able to obtain greater power with a smaller sample size, although they will be limited to a subset of the sample.
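For reference, the registered 0.9-percentage-point figure can be reproduced with the standard two-arm MDE formula, taking the registration's stated 0.125 as the outcome standard deviation and splitting the 6,000 applications evenly between the two values of a characteristic. This is a sketch under those assumptions, not the authors' own calculation code.

```python
from statistics import NormalDist

def mde_two_arm(sd, n_per_arm, alpha=0.05, power=0.8):
    """Minimum detectable effect for comparing two equal-size arms:
    (z_{1-alpha/2} + z_{power}) * sd * sqrt(2 / n_per_arm)."""
    z = NormalDist()
    return (z.inv_cdf(1 - alpha / 2) + z.inv_cdf(power)) * sd * (2 / n_per_arm) ** 0.5

# Inputs from the registration: sd = 0.125, 6,000 applications split 50/50.
mde = mde_two_arm(sd=0.125, n_per_arm=3000)
print(round(mde, 4))         # 0.009, i.e. 0.9 percentage points
print(round(mde / 0.15, 2))  # 0.06, i.e. 6 percent of the 15% base rate
```

The "6 percent" in the text is simply this MDE expressed as a share of the assumed 15% baseline call-back probability.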
References:

Bertrand, M. and S. Mullainathan (2004). "Are Emily and Greg More Employable than Lakisha and Jamal? A Field Experiment on Labor Market Discrimination." American Economic Review 94(4): 991-1013.
Demidenko, E. (2007). "Sample Size Determination for Logistic Regression Revisited." Statistics in Medicine 26: 3385-3397.
Deming, D., N. Yuchtman, A. Abulafi, C. Goldin, and L. Katz (2014). "The Value of Postsecondary Credentials in the Labor Market: An Experimental Study." NBER Working Paper #20528.
Lahey, J. (2008). "Age, Women, and Hiring: An Experimental Study." Journal of Human Resources 43(1): 30-56.
Oreopoulos, P. (2011). "Why Do Skilled Immigrants Struggle in the Labor Market? A Field Experiment with Thirteen Thousand Resumes." American Economic Journal: Economic Policy 3: 148-171.
Pager, D. (2003). "The Mark of a Criminal Record." American Journal of Sociology 108(5): 937-975.
Pager, D., B. Western, and B. Bonikowski (2009). "Discrimination in a Low-Wage Labor Market: A Field Experiment." American Sociological Review 74: 777-799.
Phillips, D. (2015). "Neighborhood Affluence or Long Commutes: Using a Correspondence Experiment to Test Why Employers Discriminate Against Applicants from Poor Neighborhoods." Unpublished working paper.
IRB
INSTITUTIONAL REVIEW BOARDS (IRBs)
IRB Name
Princeton University Institutional Review Board
IRB Approval Date
2014-12-05
IRB Approval Number
6950
IRB Name
University of Michigan Health Sciences & Behavioral Sciences Institutional Review Board
IRB Approval Date
2014-12-17
IRB Approval Number
HUM00096580
Post-Trial
Post Trial Information
Study Withdrawal
Intervention
Is the intervention completed?
Yes
Intervention Completion Date
March 31, 2016, 12:00 AM +00:00
Is data collection complete?
Yes
Data Collection Completion Date
May 26, 2016, 12:00 AM +00:00
Final Sample Size: Number of Clusters (Unit of Randomization)
Was attrition correlated with treatment status?
Final Sample Size: Total Number of Observations
Final Sample Size (or Number of Clusters) by Treatment Arms
Data Publication
Data Publication
Is public data available?
No
Program Files
Program Files
Reports and Papers
Preliminary Reports
Relevant Papers