
Technology, Information, and School Choice: A Randomized Controlled Trial

Last registered on January 25, 2022

Pre-Trial

Trial Information

General Information

Title
Technology, Information, and School Choice: A Randomized Controlled Trial
RCT ID
AEARCTR-0003736
Initial registration date
January 04, 2019

First published
January 04, 2019, 10:49 PM EST

Last updated
January 25, 2022, 3:15 PM EST

Locations

Primary Investigator

Affiliation
Teachers College, Columbia University

Other Primary Investigator(s)

PI Affiliation
Vanderbilt University
PI Affiliation
Princeton University
PI Affiliation
UCSB

Additional Trial Information

Status
Completed
Start date
2016-06-01
End date
2019-06-30
Secondary IDs
Prior work
This trial does not extend or rely on any prior RCTs.
Abstract
New York City requires all 8th grade students to engage in a school choice process to be assigned to a high school through a centralized assignment mechanism. The admissions process is complex and cognitively demanding, requiring students to select among schools with different screening methods, curricular themes, and admissions preferences. To test the potential of information to help students navigate the process, we fielded a series of informational interventions in a school-level randomized controlled trial. The initial year of our study tested the effect of a paper-based informational tool, a list of relatively high-performing high schools, in a school-level RCT of 165 high-poverty New York City middle schools (Corcoran, Jennings, Cohodes, and Sattin-Bajaj, 2018). We found that access to the informational tool increased applications to listed schools and decreased the likelihood that students attended a high school with a graduation rate below 70%.

We extended this project for an additional two years as a school-level RCT, which expanded the number of schools served to over 450 and the types of interventions offered to include a variety of technological innovations. We contrast a paper and a digital version of a curated list of schools, a guided search tool, and a directory search tool with a control group. School personnel received the intervention tools to distribute via mail, along with supportive materials (lesson plans, video guides, and support from the study office), simulating the most likely mode of dissemination if a school district were to adopt such tools. This contrasts with the prior year's study, in which trained research assistants delivered materials in a lesson. In 2016-2017, schools were randomly assigned to one of three informational interventions or a control group. In 2017-2018, we replicated the prior year's random assignment to control or treatment and offered two informational interventions in the treated groups.
External Link(s)

Registration Citation

Citation
Cohodes, Sarah et al. 2022. "Technology, Information, and School Choice: A Randomized Controlled Trial." AEA RCT Registry. January 25. https://doi.org/10.1257/rct.3736-2.1
Former Citation
Cohodes, Sarah et al. 2022. "Technology, Information, and School Choice: A Randomized Controlled Trial." AEA RCT Registry. January 25. https://www.socialscienceregistry.org/trials/3736/history/197672
Sponsors & Partners

There is information in this trial unavailable to the public.
Experimental Details

Interventions

Intervention(s)
Schools were assigned to one of three informational interventions or a control group. School personnel received the intervention tools to distribute via mail, along with supportive materials (lesson plans, video guides, and support from the study office), simulating the most likely mode of dissemination if a school district were to adopt such tools. Study personnel provided telephone support to encourage material use and answer questions. The three informational interventions were:

1) Curated List of High Schools ("Fast Facts"): this group received a concise, middle school-specific list of 26 geographically proximate high schools with graduation rates above 75%, along with each school's travel time, graduation rate, and information about how to apply. In a cross-randomization within this group, students received either a digital or paper version of the tool. A further cross-randomization made slight variations to the high school list: a third of the lists included a caution about two nearby high schools that had low graduation rates, and a third included a caution about two nearby high schools that had low admissions rates.

2) Personalized Information about High Schools from a School Choice “App”: this group received a guided introduction to an interactive smartphone/web-based tool designed to help students translate their preferences into a list of school recommendations. The app serves as a guided search tool, prompting students to identify their current middle school and their preferences for commute time, academic interests, and extra-curricular interests. It then generates a list of schools, along with performance data, that students can save, share, and explore further.

3) General Information about High Schools from a Searchable Directory ("School Finder"): This group received a guided introduction to the NYCDOE SchoolFinder, a search engine for finding high schools launched in the 2016-17 high school admissions cycle. Since all students had access to this tool (including in the control group), this group allows us to test the effect of the supportive materials that were offered as part of the intervention.
Intervention Start Date
2016-10-04
Intervention End Date
2017-10-02

Primary Outcomes

Primary Outcomes (end points)
a. High school choice outcomes (for 1st, top 3, and all choices; matched school; and enrolled school)
i. School presence on the Fast Facts and/or supplementary lists
1. presence on the list
2. presence in the top or bottom half of the list
ii. High school characteristics
1. 4-year graduation rate
2. whether the graduation rate was below the Fast Facts threshold of 70%
3. travel time from the middle school to the high school
4. location in the same borough as the student’s residence
5. applications per seat in the prior year (a measure of demand)
6. admissions method (e.g. screened or limited unscreened, different types of high schools in the NYC system)
7. variability in graduation rates (the difference between the highest and lowest graduation rate of schools appearing on a student's application)
8. academic or career interest area (e.g. STEM)
9. within-application consistency in interest area, calculated as the highest percentage of choices from the same interest area (a brief illustration of this and item 7 appears after this outcomes list)
iii. Other admissions process and enrollment outcomes
1. number of choices submitted (up to 12)
2. open house priority status for limited unscreened programs (a lottery preference given to students who attend an event for certain schools)
3. ranking of the student by screened programs
4. whether or not the student was matched to his or her 1st choice, 1st-3rd choice, or any choice in the first round
5. participation in the second round after a successful match
6. matriculation to the matched school in 9th grade
Other outcome variables:
b. Students’ academic progress and other outcomes in high school
i. Credit accumulation in 9th-12th grade, and GPA;
ii. New York State Regents exam scores and passing rates;
iii. On-time graduation;
iv. Engagement and perceptions of school (e.g. attendance rate and responses to the NYC DOE’s Learning Environment Survey)
c. Post-high school outcomes (pending data availability)
i. College enrollment, persistence, and graduation
ii. Earnings
iii. Arrest records
iv. Voter registration, voting
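
To make the two computed measures above concrete (items 7 and 9 under high school characteristics), the short Python sketch below shows one way they could be calculated from a single student's application. The choice data are hypothetical and this is not the study's code.

# Illustration only: variability in graduation rates (item 7) and
# within-application consistency in interest area (item 9).
from collections import Counter

# Each choice on a hypothetical student's application:
# (4-year graduation rate, interest area).
choices = [
    (0.82, "STEM"),
    (0.76, "STEM"),
    (0.91, "Humanities"),
    (0.68, "STEM"),
]

# Item 7: difference between the highest and lowest graduation rate
# among the schools on the application.
grad_rates = [rate for rate, _ in choices]
variability = max(grad_rates) - min(grad_rates)

# Item 9: highest percentage of choices drawn from the same interest area.
area_counts = Counter(area for _, area in choices)
consistency = max(area_counts.values()) / len(choices)

print(f"variability in graduation rates: {variability:.2f}")  # 0.23
print(f"consistency in interest area: {consistency:.0%}")     # 75%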
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
Secondary Outcomes (explanation)

Experimental Design

Experimental Design

For 2016-2017:
Randomization was conducted in two “tiers.” Tier 1 was made up of schools that participated in our 2015-2016 RCT. Because school counselors may have provided the prior year's materials to students, and for ethical reasons, we decided that all schools in Tier 1 would receive a treatment, even if they were assigned to control in 2015-2016. Thus, these schools contribute to estimating contrasts across treatments, but not comparisons to the control group. We retained the 39 blocks formed by matching in the previous year. See https://www.socialscienceregistry.org/trials/2951/history/37072 and http://www.nber.org/data-appendix/w24471/w24471.appendix.pdf for details on that randomization process. Within these 39 blocks, schools were randomly assigned to one of the three Fast Facts treatments, SchoolFinder, or the App. Within each block, the Fast Facts schools were then randomly assigned to either digital-only delivery or digital and paper delivery.

Tier 2 consisted of schools new to the study in 2016-2017; these were high- and medium-poverty schools in NYC. The approximately 100 lowest-poverty schools in NYC were excluded from the experiment. Schools were randomized regardless of their intent to participate; that is, we did not recruit schools in advance for participation. We then grouped these schools into blocks of 6 schools (where possible). With 6 schools in a block, the modal block assigned one school to each of the three Fast Facts versions, one school to School Finder, one to the App, and one to control. Blocks were thus matched sextuplets of schools selected using a Mahalanobis distance measure of difference between schools (see Bruhn & McKenzie 2009; King et al. 2007). School variables used in the matching procedure included prior choice outcomes (e.g., the mean graduation rate of first round matches in 2015-16), prior achievement (mean ELA and math scores in 2015-16), economic disadvantage (the percent of students in poverty), and school size. To maintain face validity, blocking was conducted within borough, and geographically isolated schools (i.e., the Rockaways and Staten Island) were blocked together. Additionally, schools were blocked within categories based on their response to recruitment for the 2015-2016 experiment, so that blocks were formed within groups of schools that had similar characteristics (e.g., school has returning 8th graders, school did not choose to participate in 2015-2016, school new to the study, etc.). Within these blocks, schools were randomly assigned to treatments or control. The blocks were then listed in a random order, and a cross-randomization that alternated the Fast Facts delivery method (digital only or digital and paper) was implemented.
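
As an informal illustration of the blocking and assignment procedure described above, the Python sketch below forms matched sextuplets of schools by Mahalanobis distance and then randomly assigns the six arms within each block. This is not the study's randomization code: the covariates are simulated, the greedy nearest-neighbor blocking is only one possible matching heuristic, and all variable names are ours.

import numpy as np

rng = np.random.default_rng(2016)

# Simulated school covariates (one row per school): e.g., prior-match graduation
# rate, mean ELA score, mean math score, percent in poverty, enrollment.
n_schools = 24
X = rng.normal(size=(n_schools, 5))

# Mahalanobis distance uses the inverse covariance matrix of the covariates.
VI = np.linalg.inv(np.cov(X, rowvar=False))

def mahalanobis(a, b, VI):
    d = a - b
    return float(np.sqrt(d @ VI @ d))

# Greedy blocking into sextuplets: repeatedly take an unblocked school and its
# five nearest unblocked neighbors.
unblocked = list(range(n_schools))
blocks = []
while len(unblocked) >= 6:
    seed = unblocked[0]
    nearest = sorted(unblocked[1:], key=lambda j: mahalanobis(X[seed], X[j], VI))
    block = [seed] + nearest[:5]
    blocks.append(block)
    unblocked = [j for j in unblocked if j not in block]

# Within each block, randomly assign the six arms (three Fast Facts versions,
# School Finder, the App, and control).
arms = ["FF-classic", "FF-low-grad", "FF-low-odds", "SchoolFinder", "App", "Control"]
assignment = {}
for block in blocks:
    for school, arm in zip(block, rng.permutation(arms)):
        assignment[school] = arm

print(assignment)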

For 2017-2018:
In the second year of the study (2017-2018), we retained the existing randomization described above. Schools that were previously assigned to one of the Fast Facts interventions were assigned to Fast Facts again (the low-graduation version), and schools that were assigned either School Finder or the App were assigned the App. Materials were “refreshed” and delivered electronically to guidance counselors. If a school had a new counselor since the previous year's treatment, it was sent additional paper materials.
Experimental Design Details
Randomization Method
computer program
Randomization Unit
school
Was the treatment clustered?
Yes

Experiment Characteristics

Sample size: planned number of clusters
In 2016-17: 468 schools
In 2017-18: 453 schools
Sample size: planned number of observations
In 2016-17: 59,492 students
In 2017-18: 58,614 students
Sample size (or number of clusters) by treatment arms
In 2016-17:
Control: 59 schools
Fast Facts: 249 schools (123 digital delivery, 126 digital and paper delivery; 82 classic Fast Facts, 83 Fast Facts "low-graduation", 84 Fast Facts "low-odds")
School Finder: 80
App: 80
In 2017-2018:
Control: 59 schools
Fast Facts: 242 schools (digital delivery of the low-grad version)
App: 152 schools
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
Our power calculations were done before conducting the first year of the experiment. To estimate MDES, we made standard assumptions regarding power (.80) and significance (α=.05). We assumed 150 eighth graders per cluster (school), the NYC average, and 2 clusters per block. (In practice there will be 4 schools per block, representing our 3 treatments and one control. For illustrative purposes we provide the MDES for only one treatment-control comparison.) We assumed two intraclass (within-middle school) correlations: a relatively low ICC (ρ=0.08) and a modest ICC (ρ=0.20). These correspond to the range of ICCs observed for several outcomes in existing administrative data, after removing the effect of pre-treatment covariates. We assumed a conservative R2 from regression on pre-treatment covariates of 0.1, and that the blocks account for 0.1 to 0.4 of the variance. For example, with 60 schools (30 blocks), we have sufficient power to detect effect sizes ranging from 0.20 under the low ICC, high between-block variation assumption, to 0.35 under the high ICC, low between-block variation assumption. These translate into an increase of 2.3 to 4.1 percentage points in the four-year graduation rate of the first-choice high school (or the average of the top three choices), 1.1 to 1.7 points in high school value-added to graduation rates, and 2.4 to 3.5 percentage points in the percent of students accumulating 10 or more credits in year one of high school at the first choice (or the average of the top three). This exercise offers some confidence in our decision to recruit at least 30 schools per treatment group and for the control group.
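
For reference, the sketch below computes an MDES of this kind using a standard formula for a two-level cluster-randomized design. The study team's exact formula, degrees of freedom, and software may differ, so the outputs are only rough approximations of the figures cited above; the parameter values are taken from the text.

from scipy import stats

def mdes(J, n, icc, r2_cluster, r2_student, P=0.5, alpha=0.05, power=0.80):
    """Minimum detectable effect size for a two-level cluster-randomized design.

    J: total number of clusters (schools) across treatment and control
    n: students per cluster
    icc: intraclass correlation
    r2_cluster: cluster-level variance explained (here, by blocking)
    r2_student: student-level variance explained by pre-treatment covariates
    P: proportion of clusters assigned to treatment
    """
    df = J - 2
    multiplier = stats.t.ppf(1 - alpha / 2, df) + stats.t.ppf(power, df)
    var_term = (icc * (1 - r2_cluster) / (P * (1 - P) * J)
                + (1 - icc) * (1 - r2_student) / (P * (1 - P) * J * n))
    return multiplier * var_term ** 0.5

# 60 schools (30 treatment-control pairs), 150 students per school.
# Low ICC, high between-block variance scenario:
print(round(mdes(J=60, n=150, icc=0.08, r2_cluster=0.4, r2_student=0.1), 2))
# High ICC, low between-block variance scenario:
print(round(mdes(J=60, n=150, icc=0.20, r2_cluster=0.1, r2_student=0.1), 2))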
IRB

Institutional Review Boards (IRBs)

IRB Name
New York University
IRB Approval Date
2015-06-12
IRB Approval Number
15-10671
IRB Name
Teachers College, Columbia University
IRB Approval Date
2016-07-18
IRB Approval Number
16-420

Post-Trial

Post Trial Information

Study Withdrawal

There is information in this trial unavailable to the public.

Intervention

Is the intervention completed?
Yes
Intervention Completion Date
February 01, 2018, 12:00 +00:00
Data Collection Complete
Yes
Data Collection Completion Date
January 01, 2022, 12:00 +00:00
Final Sample Size: Number of Clusters (Unit of Randomization)
473
Was attrition correlated with treatment status?
No
Final Sample Size: Total Number of Observations
115,126 students
Final Sample Size (or Number of Clusters) by Treatment Arms
136 schools were assigned to one of the three Fast Facts treatments, 58 to the App, 58 to School Finder, and 60 to control. Within the Fast Facts treatment, 68 schools each were assigned to digital or digital and paper delivery of the intervention. 45 schools each were assigned to classic Fast Facts and Fast Facts low graduation, with 46 schools assigned to Fast Facts low odds.
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
No
Reports, Papers & Other Materials

Relevant Paper(s)

Abstract
This paper reports the results of a large, school-level randomized controlled trial evaluating a set of three informational interventions for young people choosing high schools in 473 middle schools, serving over 115,000 8th graders. The interventions differed in their level of customization to the student and their mode of delivery (paper or online); all treated schools received identical materials to scaffold the decision-making process. Every intervention reduced the likelihood of application to and enrollment in schools with graduation rates below the city median (75 percent). An important channel is their effect on reducing nonoptimal first choice application strategies. Providing a simplified, middle-school specific list of relatively high graduation rate schools had the largest impacts, causing students to enroll in high schools with 1.5-percentage point higher graduation rates. Providing the same information online, however, did not alter students’ choices or enrollment. This appears to be due to low utilization. Online interventions with individual customization, including a recommendation tool and search engine, induced students to enroll in high schools with 1-percentage point higher graduation rates, but with more variance in impact. Together, these results show that successful informational interventions must generate engagement with the material, and this is possible through multiple channels.
Citation
Cohodes, Sarah R., Sean P. Corcoran, Jennifer L. Jennings, and Carolyn Sattin-Bajaj. “When Do Informational Interventions Work? Experimental Evidence from New York City High School Choice.” National Bureau of Economic Research Working Paper 29690, 2022.

Reports & Other Materials