Experimental Design
For 2016-2017:
Randomization was conducted in two “tiers.” Tier 1 was made up of schools that participated in our 2015-2016 RCT. Because school counselors may have provided prior years’ materials to students, and for ethical reasons, we decided that all schools in Tier 1 would receive a treatment, even if they had been assigned to control in 2015-2016. Thus, these schools contribute to estimating contrasts across treatments, but not comparisons to the control group. We retained the 39 blocks formed by matching in the previous year. See https://www.socialscienceregistry.org/trials/2951/history/37072 and http://www.nber.org/data-appendix/w24471/w24471.appendix.pdf for details on that randomization process. Within these 39 blocks, schools were randomly assigned to one of the three Fast Facts treatments, School Finder, or the App. Within each block, the Fast Facts schools were then randomly assigned to either digital-only delivery or digital-and-paper delivery.
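To make the two-stage Tier 1 assignment concrete, the sketch below shows one way such a procedure could be implemented in Python. It is illustrative only: the arm labels, the assumption that the five treatments cycle evenly within a block, and the single per-block draw for delivery mode are our simplifications, not the study's actual randomization code.

```python
import random

# Illustrative sketch of the Tier 1 assignment, not the study's actual code.
# `blocks` maps a block id to a list of school ids; arm labels are hypothetical.
TIER1_ARMS = ["FF_v1", "FF_v2", "FF_v3", "SchoolFinder", "App"]  # no control arm in Tier 1

def assign_tier1(blocks, seed=2016):
    rng = random.Random(seed)
    assignment = {}
    for block_id, schools in blocks.items():
        shuffled = schools[:]
        rng.shuffle(shuffled)
        # Cycle through the five arms so each is used once before any repeats.
        for i, school in enumerate(shuffled):
            assignment[school] = {"block": block_id, "arm": TIER1_ARMS[i % len(TIER1_ARMS)]}
        # Assign a delivery mode to the block's Fast Facts schools; a single
        # draw per block is one reading of the design and is an assumption here.
        delivery = rng.choice(["digital_only", "digital_and_paper"])
        for school in shuffled:
            if assignment[school]["arm"].startswith("FF"):
                assignment[school]["delivery"] = delivery
    return assignment
```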
Tier 2 consisted of schools new to the study in 2016-2017; these were high- and medium-poverty schools in NYC. The approximately 100 lowest-poverty schools in NYC were excluded from the experiment. Schools were randomized regardless of their intent to participate; that is, we did not recruit schools in advance for participation. We then grouped these schools into blocks of six (where possible). With six schools in a block, the modal block assigned one school to each of the three Fast Facts versions, one school to School Finder, one to the App, and one to control. Blocks were thus matched sextuplets of schools selected using a Mahalanobis distance measure of the difference between schools (see Bruhn & McKenzie 2009; King et al. 2007). School variables used in the matching procedure included prior choice outcomes (e.g., the mean graduation rate of first-round matches in 2015-16), prior achievement (mean ELA and math scores in 2015-16), economic disadvantage (the percent of students in poverty), and school size. To maintain face validity, blocking was conducted within borough, and geographically isolated schools were blocked together (i.e., the Rockaways and Staten Island). Additionally, schools were blocked within categories based on their response to recruitment for the 2015-2016 experiment, so that blocks were formed within groups of schools with similar characteristics (e.g., school has returning 8th graders, school did not choose to participate in 2015-2016, school new to the study, etc.). Within these blocks, schools were randomly assigned to treatments or control. The blocks were then listed in a random order, and a cross-randomization that alternated the Fast Facts delivery method (digital only or digital and paper) across blocks was implemented.
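The Tier 2 procedure combines Mahalanobis-distance blocking, within-block randomization, and an alternating cross-randomization of delivery mode. The sketch below, assuming a simple greedy nearest-neighbor rule for forming sextuplets and hypothetical arm labels and covariates, shows one way such a procedure could look; the authors' actual matching may differ (e.g., optimal rather than greedy matching, or additional within-borough constraints).

```python
import numpy as np

# Illustrative sketch of the Tier 2 blocking and assignment, not the study's code.
# X: (n_schools x k) array of matching covariates (prior choice outcomes, prior
# achievement, poverty, size); school_ids: parallel list. All names are hypothetical.

def mahalanobis_blocks(X, school_ids, block_size=6):
    cov_inv = np.linalg.pinv(np.cov(X, rowvar=False))
    remaining = list(range(len(school_ids)))
    blocks = []
    while remaining:
        # Seed a block with the first remaining school, then add its nearest
        # neighbors by Mahalanobis distance until the block is full (greedy rule).
        anchor = remaining.pop(0)
        diffs = X[remaining] - X[anchor]
        sq_dists = np.einsum("ij,jk,ik->i", diffs, cov_inv, diffs)
        nearest = np.argsort(sq_dists)[: block_size - 1]
        members = [anchor] + [remaining[i] for i in nearest]
        remaining = [r for r in remaining if r not in members]
        blocks.append([school_ids[i] for i in members])
    return blocks

TIER2_ARMS = ["FF_v1", "FF_v2", "FF_v3", "SchoolFinder", "App", "Control"]

def assign_tier2(blocks, seed=2016):
    rng = np.random.default_rng(seed)
    assignment = {}
    # List blocks in random order so the Fast Facts delivery mode can alternate.
    for rank, b in enumerate(rng.permutation(len(blocks))):
        delivery = "digital_only" if rank % 2 == 0 else "digital_and_paper"
        for school, arm in zip(blocks[b], rng.permutation(TIER2_ARMS)):
            assignment[school] = {"block": int(b), "arm": str(arm),
                                  "delivery": delivery if str(arm).startswith("FF") else None}
    return assignment
```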
For 2017-2018:
In 2017-2018, the second year of the study, we retained the randomization described above. Schools that had previously been assigned to one of the Fast Facts interventions were assigned to Fast Facts again (the low-graduation version), and schools that had been assigned either School Finder or the App were assigned the App. Materials were “refreshed” and delivered electronically to guidance counselors. If a school had received a new counselor since the previous year’s treatment, it was sent additional paper materials.
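Under the hypothetical arm labels used in the sketches above, this carry-over rule reduces to a simple mapping; the function below is a minimal illustration, not the authors' implementation.

```python
# Minimal sketch of the 2017-2018 carry-over rule, using the hypothetical arm
# labels from the sketches above.
def reassign_2017_2018(arm_2016_2017):
    if arm_2016_2017.startswith("FF"):
        return "FF_low_graduation"   # any Fast Facts version -> low-graduation Fast Facts
    if arm_2016_2017 in ("SchoolFinder", "App"):
        return "App"                 # School Finder or the App -> the App
    return arm_2016_2017             # control schools remain control
```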