Education for All? A Nationwide Audit Study of School Choice
Last registered on January 10, 2020

Pre-Trial

Trial Information
General Information
Title
Education for All? A Nationwide Audit Study of School Choice
RCT ID
AEARCTR-0005288
Initial registration date
January 09, 2020
Last updated
January 10, 2020 11:38 AM EST
Location(s)
Region
Primary Investigator
Affiliation
University of Florida
Other Primary Investigator(s)
PI Affiliation
Teachers College, Columbia University
Additional Trial Information
Status
Completed
Start date
2014-11-01
End date
2019-09-30
Secondary IDs
Abstract
School choice may allow schools to “cream skim” students perceived as easier to educate, or to impede access for students perceived as more challenging to educate. To test this, we conducted two field experiments in which we sent emails from fictitious parents to charter schools and to traditional public schools subject to school choice in 29 states and Washington, D.C. The fictitious parent asked whether any student is eligible to apply to the school and how to apply. Each email signaled a randomly assigned attribute of the child, one of four conditions: a significant special need, poor behavior, high or low prior achievement, or no indication of any condition. In addition, fictitious parents and children were randomly assigned other attributes, including ethnic-sounding names, gendered-sounding names, whether the child is raised in a two-parent versus one-parent household, and whether the child is a son or daughter. Email responses were monitored to detect any systematic evidence of discriminatory practices.
External Link(s)
Registration Citation
Citation
Bergman, Peter and Isaac McFarlin. 2020. "Education for All? A Nationwide Audit Study of School Choice." AEA RCT Registry. January 10. https://doi.org/10.1257/rct.5288-1.0.
Sponsors & Partners
Sponsor(s)
Experimental Details
Interventions
Intervention(s)
Across two field experiments, we sent email messages to charter schools and to traditional public schools subject to school choice. Each message was framed as coming from a parent looking for a school for their son or daughter; the parent asked whether anyone can apply to the school and how to apply. Each treatment message added one sentence to this baseline message to signal the child’s potential cost to educate, disadvantage, or prior academic performance. This sentence indicated the child has one of the following: an IEP requiring that they be taught in a classroom separate from mainstream students; poor behavior; bad grades; or good attendance and good grades. In each experiment, we sent two emails to each school three to four weeks apart.

Intervention Start Date
2018-01-15
Intervention End Date
2018-03-30
Primary Outcomes
Primary Outcomes (end points)
Whether charter schools and traditional public schools subject to school choice policies respond to inquiries from fictitious parents about the application process.
Primary Outcomes (explanation)
Secondary Outcomes
Secondary Outcomes (end points)
Secondary Outcomes (explanation)
Experimental Design
Experimental Design
The first audit study focused exclusively on charter schools in the 17 states with the largest number of charter schools; it was implemented between November and December 2014. The sample for this first experiment included 3,131 charter schools, or roughly half of all charter schools in the country at that time. Treatment messages were randomly assigned at the school level in the first experiment.

Based on feedback we received from researchers in November 2016, we conducted a second audit study of charter schools matched to traditional public schools subject to school choice policies. The purpose of this exercise was to compare response rates between traditional public schools and charter schools. Specifically, we limited the geographic scope of the study to areas that practice some type of intra-district school choice. We matched every charter school in these areas to the nearest traditional public school with the same entry-grade level within the same school district boundaries. This latter sample consists of 4,338 schools, or 2,169 matched pairs of traditional public schools and charter schools. In the second experiment, treatment was clustered at the matched-pair level: identical messages from the same fictitious parent were sent to each charter school and its matched traditional public school.
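The nearest-school matching described above can be sketched as follows. This is a minimal illustration, not the study’s actual code: the field names (`district`, `entry_grade`, `lat`, `lon`) and the use of great-circle distance are assumptions for the example.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    # Great-circle distance between two coordinate pairs, in miles.
    r = 3958.8  # Earth radius in miles
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def match_nearest_tps(charters, tps_schools):
    """Match each charter school to the nearest traditional public school
    with the same entry grade inside the same district (None if no match).
    Schools are dicts with hypothetical keys: id, district, entry_grade, lat, lon."""
    pairs = {}
    for c in charters:
        candidates = [t for t in tps_schools
                      if t["district"] == c["district"]
                      and t["entry_grade"] == c["entry_grade"]]
        if not candidates:
            pairs[c["id"]] = None
            continue
        nearest = min(candidates,
                      key=lambda t: haversine_miles(c["lat"], c["lon"],
                                                    t["lat"], t["lon"]))
        pairs[c["id"]] = nearest["id"]
    return pairs
```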

Within each experiment, no school received the same message treatment twice; treatments were assigned without replacement. We also randomized the order in which schools were contacted.
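The assignment procedure above can be sketched as a simple without-replacement draw per cluster. This is an illustrative sketch only: it assigns the two waves with equal probability across arms, whereas the registry’s reported arm sizes imply unequal assignment probabilities, and the treatment labels here are shorthand.

```python
import random

# Shorthand labels for the treatment arms described in the registry.
TREATMENTS = ["baseline", "iep", "poor_behavior", "poor_grades",
              "good_grades_attendance"]

def assign_waves(cluster_ids, treatments=TREATMENTS, seed=0):
    """For each cluster (a school in experiment 1, a matched pair in
    experiment 2), draw two distinct treatments without replacement,
    one per email wave, so no cluster repeats a treatment."""
    rng = random.Random(seed)
    return {cid: tuple(rng.sample(treatments, 2)) for cid in cluster_ids}

def contact_order(cluster_ids, seed=0):
    """Randomize the order in which schools are contacted."""
    order = list(cluster_ids)
    random.Random(seed).shuffle(order)
    return order
```

In the second experiment, both schools in a matched pair would share the cluster’s draw, so the charter school and its matched traditional public school receive identical messages.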
Experimental Design Details
Randomization Method
Randomization was done in office by a computer.
Randomization Unit
In the first field experiment conducted in 2014, randomization was conducted at the school level. For the second field experiment conducted in 2018, randomization was conducted at the level of matched-pairs between traditional public schools and charter schools (see above).
Was the treatment clustered?
Yes
Experiment Characteristics
Sample size: planned number of clusters
The treatment was clustered for the second field experiment, and there were 2,169 matched-pairs of charter schools and traditional public schools, or a total of 4,338 schools.
Sample size: planned number of observations
For the first experiment, there are 3,131 observations. For the second field experiment, there are 4,338 schools within 2,169 clusters.
Sample size (or number of clusters) by treatment arms
First experiment, conducted in 2014: 2,082 messages sent for the baseline message; 2,091 messages sent for the IEP (individualized education plan) message signaling that the child has a significant special need; 1,026 messages signaling that the child has poor behavior; and 1,011 messages signaling that the child has poor grades.

Second experiment, conducted in 2018: 2,949 messages sent for the baseline message; 1,431 messages sent for the IEP (individualized education plan) message signaling that the child has a significant special need; 1,436 messages signaling that the child has poor behavior; 1,430 messages signaling that the child has poor grades; and 1,434 messages signaling that the child has good grades and attendance.
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
Not applicable.
IRB
INSTITUTIONAL REVIEW BOARDS (IRBs)
IRB Name
Teachers College IRB
IRB Approval Date
2018-02-03
IRB Approval Number
15-118 Protocol
IRB Name
University of Florida
IRB Approval Date
2017-10-30
IRB Approval Number
IRB201702513
Post-Trial
Post Trial Information
Study Withdrawal
Intervention
Is the intervention completed?
No
Is data collection complete?
Data Publication
Data Publication
Is public data available?
No
Program Files
Program Files
Reports and Papers
Preliminary Reports
Relevant Papers