One Summer Chicago PLUS: Scaling and "Unpacking" a Successful Program
Initial registration date
September 02, 2015
July 31, 2017 12:05 PM EDT
Booth School of Business, University of Chicago
Other Primary Investigator(s)
University of Pennsylvania
Additional Trial Information
Chicago's Department of Family and Support Services will provide summer employment and an adult mentor to disadvantaged Chicago youth over the summer of 2015 through its "One Summer PLUS" program. The researchers are assisting DFSS in running a lottery to randomly assign the pool of program applicants into two treatment groups and one control group, and evaluating the outcomes of the program. This random-assignment lottery will allow the program to be evaluated as a randomized controlled trial. Applicants will be randomly assigned to two versions of the program - one including an adult job mentor and one that only provides the job itself - or to a control group.
We will track applicants to the program through existing administrative databases to assess the short- and long-term effects of the program, including education, crime, and labor market outcomes. Additionally, we will survey youth (both those offered and those not offered the program) to learn more about the differences in experiences and outcomes between the two groups. This is the third in a series of studies of Chicago's One Summer PLUS summer employment program.

Registration Citation
Bertrand, Marianne and Sara Heller. 2017. "One Summer Chicago PLUS: Scaling and 'Unpacking' a Successful Program." AEA RCT Registry. July 31.
OSC+ youth are offered a six-week summer job at $8.25 per hour. Local community-based non-profit agencies provide a brief training including basic job skill and financial literacy instruction, then place youth in jobs, provide one meal per day, and aim to link youth with opportunities for other social supports after the summer ends. Jobs include government, non-profit, and private sector positions. Programming covers 5 hours per day, 5 days per week.
Half the treatment group will be assigned adult mentors at a ratio of 20:2. Mentors provide continuous feedback and support for the youth, including at their job sites. They provide youth with their cell phone numbers, so they are accessible for advice and problem-solving even outside of work hours. They will also provide a manualized curriculum aimed at developing workforce skills (meaning youth will work fewer hours to make room for this training). The other half of the treatment group will have adult supervisors, but neither intensive mentoring nor the additional skill-development curriculum.
The control group will not be offered any OSC+ services. They will, however, be free to pursue other opportunities on their own, including other summer programming offered by the City of Chicago.
Intervention Start Date
Intervention End Date
Primary Outcomes (end points)
Number of violent crime arrests.
Number of arrests for other types of crime: property, drug, and other.
Schooling outcomes: graduation rates, GPA, rates of in-school discipline (suspensions and expulsions), and attendance (days present).
Labor market outcomes: employment rates and earnings.
Mechanisms, such as participants' attitudes, measured from survey responses.
Primary Outcomes (explanation)
Because OSC+ is positioned as a violence-reduction intervention, we are primarily interested in the impact that an offer of program participation has on the number of violent-crime arrests. However, we are also interested in rates of arrest for other types of crime.
Outside of crime, we are also interested in potential program impacts on schooling and labor market outcomes, as well as participants' attitudes about themselves, their education, their career plans, and their futures.
Secondary Outcomes (end points)
Secondary Outcomes (explanation)
The research team randomly assigned 5444 eligible program applicants to one of three arms: job+mentor, job only, and control. Random assignment enables estimation of the causal impact of the OSC PLUS program on a range of outcomes.
Youth assigned to the treatment group are placed in specific job sites by third parties contracted by the city to manage the job placement, conduct the mentoring, and handle payroll for the youth. There are 19 of these providers for the 2015 OSC PLUS program, which are located throughout the city of Chicago.
The study employs a block-randomization design. The blocks were designed to ensure an equal probability of treatment across blocks and to minimize the average distance from participants to providers. Within blocks, participants were randomly assigned to one of three or four providers and to one of the treatment groups or the control.
Experimental Design Details
Previous empirical evidence has shown that the program significantly reduces the number of violent-crime arrests among treated youth. In addition to replicating this earlier finding in a larger study population, we are interested in isolating the components of the program that are most essential to producing the positive outcomes the City has enjoyed in recent years - e.g., whether reductions in violence have been driven simply by access to work and income, or by the prosocial and behavioral effects of mentoring.
In order to estimate the causal impact of the different program models on our measured outcomes, we randomly assigned eligible program applicants to one of three arms: one that provides a summer job, 24/7 access to an adult mentor, and 5 hours of mentoring curriculum per week; one that provides the job only; and a control group receiving no services. Separating the components of the program model into two distinct treatment arms and a control group will allow for a straightforward causal estimate of the effects of the program on the number of violent-crime arrests among treated participants.
In order to study the impact of provider-level characteristics, we employed a block-randomized design to randomly assign applicants to providers and treatment within geographic blocks. Using geocoded address data from the full list of eligible applicants, we designed six blocks based around clusters of providers that were of similar size and reasonable proximity. We also designed the blocks to account for the geographic density of the applicants in relation to the location of provider offices, allowing for an approximately equal probability of treatment across groups while attempting to minimize the average distance traveled by treatment participants.
The block-randomized design will also allow us to examine secondary research questions, including investigating the variation in outcomes across providers. The 19 providers vary in size, operational capacity, and program experience. Exploring the scalability of the OSC program requires greater understanding of the nuances of what characteristics of providers are required for program success.
Randomization was done in office, by computer.
Individual participants were randomized to one of three or four providers and one of two treatment groups or a control within the geographic blocks.
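The within-block procedure described above can be sketched in code. This is a minimal illustration, not the study's actual assignment program: the function name, arm labels, and the convention of rotating treated youth across providers are assumptions for the example, and the real design's assignment probabilities and provider capacities are not specified here.

```python
import random

def assign_within_block(applicants, providers, arm_shares, seed=0):
    """Shuffle a block's applicants, then deal out treatment arms in fixed
    shares; treated youth are rotated across the block's providers.
    The last arm listed absorbs any rounding remainder."""
    rng = random.Random(seed)
    shuffled = list(applicants)
    rng.shuffle(shuffled)
    n = len(shuffled)
    assignments = {}
    start = 0
    arms = list(arm_shares.items())
    for j, (arm, share) in enumerate(arms):
        count = n - start if j == len(arms) - 1 else round(share * n)
        for i, youth in enumerate(shuffled[start:start + count]):
            # Control youth receive no provider; treated youth cycle
            # through the block's providers.
            provider = None if arm == "control" else providers[i % len(providers)]
            assignments[youth] = (arm, provider)
        start += count
    return assignments

# Example: a block of 12 applicants, 3 providers, and the study's
# roughly 25/25/50 split across job+mentor, job-only, and control.
block = [f"youth{i}" for i in range(12)]
result = assign_within_block(
    block,
    ["Provider A", "Provider B", "Provider C"],
    {"job+mentor": 0.25, "job_only": 0.25, "control": 0.50},
)
```

Fixing the seed makes the lottery reproducible and auditable, which matters when assignment results must be defended to program staff and applicants.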
Was the treatment clustered?
Sample size: planned number of clusters
Sample size: planned number of observations
Sample size (or number of clusters) by treatment arms
All six blocks contain study participants in both treatment arms and the control. The sample-size breakdown by treatment arm for the entire study population (5444 youth in total) is as follows:
1) 1258 youth were randomized to the treatment arm that paired a job with a weekly mentoring component.
2) 1246 youth were randomized to the treatment arm that provided a job only, without the mentoring component.
3) 2940 youth were randomized to the control group of no services.
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
In a 2012 study, which recruited a similar population of youth and used the same data sources as the proposed project, baseline covariates explained between 10 and 50 percent of the variation in outcomes (less for arrests, more for schooling outcomes). Since we have no reason to suspect that the treatment effect will vary by block, we do not account for any variability at the block level. Under these assumptions, the overall treatment-control contrast has a minimum detectable effect (MDE) of between 0.05 and 0.08 standard deviations, and the across-treatment contrast has an MDE of 0.09 to 0.12 SDs.
INSTITUTIONAL REVIEW BOARDS (IRBs)
Social and Behavioral Sciences Institutional Review Board
IRB Approval Date
IRB Approval Number
Post Trial Information
Is the intervention completed?
Is data collection complete?