One Summer Chicago PLUS: Scaling and "Unpacking" a Successful Program

Last registered on July 31, 2017

Pre-Trial

Trial Information

General Information

Title
One Summer Chicago PLUS: Scaling and "Unpacking" a Successful Program
RCT ID
AEARCTR-0000805
Initial registration date
September 02, 2015

Initial registration date is when the trial was registered.

It corresponds to when the registration was submitted to the Registry to be reviewed for publication.

First published
September 02, 2015, 1:14 PM EDT

First published corresponds to when the trial was first made public on the Registry after being reviewed.

Last updated
July 31, 2017, 12:05 PM EDT

Last updated is the most recent time when changes to the trial's registration were published.

Locations

Primary Investigator

Affiliation
Booth School of Business, University of Chicago

Other Primary Investigator(s)

PI Affiliation
University of Pennsylvania

Additional Trial Information

Status
Ongoing
Start date
2015-05-19
End date
2017-12-31
Secondary IDs
Abstract
Chicago's Department of Family and Support Services will provide summer employment and an adult mentor to disadvantaged Chicago youth over the summer of 2015 through its "One Summer PLUS" program. The researchers are assisting DFSS in running a lottery to randomly assign the pool of program applicants into two treatment groups and one control group, and evaluating the outcomes of the program. This random-assignment lottery will allow the program to be evaluated as a randomized controlled trial. Applicants will be randomly assigned to two versions of the program - one including an adult job mentor and one that only provides the job itself - or to a control group.

We will track applicants to the program through existing administrative databases to assess the short- and long-term effects of the government's program, including education, crime, and labor market outcomes. Additionally, we will survey youth (both offered and not offered the program) in order to learn more about the differences in experiences and outcomes between those who were offered the program and those who were not. This is the third in a series of studies of Chicago's One Summer PLUS summer employment program.
External Link(s)

Registration Citation

Citation
Bertrand, Marianne and Sara Heller. 2017. "One Summer Chicago PLUS: Scaling and "Unpacking" a Successful Program." AEA RCT Registry. July 31. https://doi.org/10.1257/rct.805-2.0
Former Citation
Bertrand, Marianne and Sara Heller. 2017. "One Summer Chicago PLUS: Scaling and "Unpacking" a Successful Program." AEA RCT Registry. July 31. https://www.socialscienceregistry.org/trials/805/history/19981
Sponsors & Partners

There is information in this trial unavailable to the public.
Experimental Details

Interventions

Intervention(s)
OSC+ youth are offered a six-week summer job at $8.25 per hour. Local community-based non-profit agencies provide a brief training that includes basic job-skills and financial-literacy instruction, then place youth in jobs, provide one meal per day, and aim to link youth with opportunities for other social supports after the summer ends. Jobs include government, non-profit, and private sector positions. Programming covers 5 hours per day, 5 days per week.

Half the treatment group will be assigned adult mentors at a ratio of 20:2. Mentors provide continuous feedback and support for the youth, including at their job sites. They give youth their cell phone numbers, so they are accessible for advice and problem-solving even outside of work hours. They will also deliver a manualized curriculum aimed at developing workforce skills (meaning youth will work fewer hours to make room for this training). The other half of the treatment group will have adult supervisors but will receive neither intensive mentoring nor the additional skill-development curriculum.

The control group will not be offered any OSC+ services. They will, however, be free to pursue other opportunities on their own, including other summer programming offered by the City of Chicago.
Intervention (Hidden)
The researchers will have three roles. The first two are: 1) assisting DFSS in running a lottery to randomly assign the pool of applicants into two treatment groups and one control group, and 2) evaluating the outcomes of the program. (This random-assignment lottery will allow the program to be evaluated as a randomized controlled trial. Applicants will be randomly assigned to two versions of the program - one including an adult job mentor and one that only provides the job itself - or to a control group.)

The third role is to administer a survey to a subset of the treatment and control groups, to be carried out by a third-party survey organization. Youth who are designated for survey participation will be contacted in August by phone, email, or postal mail and asked to participate in a phone or online survey that assesses their attitudes about work, school, their summer plans, and their futures. As a participation incentive, a gift card will be offered to any youth who completes the survey.

We will then conduct an outcome evaluation (with data analysis conducted at the University of Chicago) that will document the causal effect of the program and its cost effectiveness.

The outcome evaluation will use existing administrative data from DFSS and its delegate agencies on youth program participation (data agreement forthcoming), as well as data from the Chicago Public Schools (CPS), the Chicago Police Department (CPD), and Illinois Unemployment Insurance (UI) records, to document the effect of the program on school engagement, criminal behavior, and future employment.

To perform the outcome evaluation, researchers will use probabilistic matching software to link all study participants with their corresponding records in CPS and CPD administrative data. CPS ID numbers and CPD IR numbers will not be included in the merged data set used for analysis (see data section). The Illinois Department of Employment Security (IDES) will match participants to their unemployment insurance earnings records using Social Security numbers provided by DFSS.
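
The registration does not name the matching software or the identifiers it uses; as an illustration only, the sketch below shows the general shape of probabilistic record linkage on fields such as name and date of birth. The field names, weights, and acceptance threshold are hypothetical, not the study's actual procedure.

```python
# Illustrative sketch only: the matching software, identifiers, weights, and
# threshold used in the study are not specified in the registration.
from difflib import SequenceMatcher

def name_similarity(a, b):
    """Fuzzy similarity between two name strings, in [0, 1]."""
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()

def match_score(applicant, record):
    """Weighted score combining fuzzy name agreement and exact DOB agreement."""
    score = 0.6 * name_similarity(applicant["name"], record["name"])
    score += 0.4 * (applicant["dob"] == record["dob"])
    return score

def best_match(applicant, records, threshold=0.85):
    """Return the highest-scoring administrative record above the threshold, else None."""
    score, record = max(((match_score(applicant, r), r) for r in records),
                        key=lambda pair: pair[0])
    return record if score >= threshold else None
```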

We will assemble all data pieces into an analysis data set using the random identifier assigned to each applicant. The data will be stored on the Crime Lab's Research Server, which will be accessible only to study personnel. Researchers will securely store the data through a five-year follow-up period. If the one-year and two-year results suggest that a longer-term follow-up would be beneficial, we will submit a revision requesting an extension of the study.
Intervention Start Date
2015-06-29
Intervention End Date
2015-08-14

Primary Outcomes

Primary Outcomes (end points)
Number of violent crime arrests.
Number of arrests for other types of crime: property, drug, other.
Schooling outcomes: graduation rates, GPA, rates of in-school discipline (suspension and expulsion), attendance (days present).
Labor market outcomes: employment rates and earnings.
Mechanisms such as participants' attitudes, measured from survey responses.
Primary Outcomes (explanation)
Because OSC+ is positioned as a violence-reduction intervention, we are primarily interested in the impact that an offer of program participation has on the number of violent-crime arrests. However, we are also interested in rates of arrest for other types of crime.

Outside of crime, we are also interested in potential program impacts on schooling and labor market outcomes, as well as on participants' attitudes about themselves, their education, their career plans, and their futures.

Secondary Outcomes

Secondary Outcomes (end points)
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
The research team randomly assigned 5444 eligible program applicants to one of three arms: job+mentor, job only, and control. Random assignment enables estimation of the causal impact of the OSC PLUS program on a range of outcomes.

Youth assigned to the treatment group are placed in specific job sites by third parties contracted by the city to manage job placement, conduct the mentoring, and handle payroll for the youth. There are 19 such providers for the 2015 OSC PLUS program, located throughout the city of Chicago.

The study employs a block-randomization design. The blocks were designed to ensure equal probability of treatment across blocks and to minimize the average distance from participants to providers. Within blocks, participants were randomly assigned to one of three or four providers, and to one of the treatment groups or the control group (a sketch of this within-block assignment step appears below).
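
The study's actual assignment code is not part of this registration; the following is a minimal sketch of what within-block assignment could look like, assuming fixed arm probabilities and a hypothetical provider list and seed.

```python
# Illustrative sketch of within-block random assignment; the arm probabilities,
# provider list, and seed are hypothetical and not taken from the study.
import random

rng = random.Random(2015)  # fixed seed so the assignment is reproducible

def assign_block(applicant_ids, providers, p_mentor=0.23, p_job_only=0.23):
    """Assign each applicant in one geographic block to an arm and a provider."""
    assignments = {}
    for applicant_id in applicant_ids:
        u = rng.random()
        if u < p_mentor:
            arm = "job + mentor"
        elif u < p_mentor + p_job_only:
            arm = "job only"
        else:
            arm = "control"
        # Treated youth are also assigned to one of the block's 3-4 providers.
        provider = rng.choice(providers) if arm != "control" else None
        assignments[applicant_id] = (arm, provider)
    return assignments
```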
Experimental Design Details
Previous empirical evidence has shown that the program significantly reduces the number of violent-crime arrests among treated youth. In addition to replicating the earlier finding on a larger study population, we are interested in isolating the components of the program that are most essential to producing the positive outcomes the City has enjoyed in recent years - e.g., whether reductions in violence have been driven simply by access to work and income, or by the prosocial and behavioral effects of mentoring.

In order to estimate the causal impact of the different program models on our measured outcomes, we randomly assigned eligible program applicants to one of three arms: one that provides a summer job and 24/7 access to an adult mentor as well as 5 hours of mentoring curriculum per week, one that provides the job only, and a control group that receives no services. Separating the components of the program model into two distinct treatment arms and a control group will allow for a straightforward causal estimate of the effects of the program on the number of violent-crime arrests among treated participants.

In order to study the impact of provider-level characteristics, we employed a block-randomized design to randomly assign applicants to providers and treatment within geographic blocks. Using geocoded address data from the full list of eligible applicants, we designed six blocks based around clusters of providers that were of similar size and reasonable proximity. We also designed the blocks to account for the geographic density of the applicants in relation to the location of provider offices, allowing for an approximately equal probability of treatment across groups while attempting to minimize the average distance traveled by treatment participants.
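
The exact algorithm used to construct the six blocks is not described here. Purely as one possible sketch, provider offices could be clustered on their geocoded coordinates and each applicant attached to the nearest cluster; the clustering rule, library choice, and coordinate inputs below are assumptions, not the study's method.

```python
# One possible sketch of geographic blocking, not the study's actual procedure:
# cluster provider offices into six blocks, then attach each geocoded applicant
# to the block whose provider centroid is closest.
import numpy as np
from sklearn.cluster import KMeans

def build_blocks(provider_xy, applicant_xy, n_blocks=6, seed=0):
    """provider_xy, applicant_xy: arrays of shape (n, 2) of geocoded coordinates."""
    km = KMeans(n_clusters=n_blocks, random_state=seed, n_init=10).fit(provider_xy)
    provider_block = km.labels_  # block membership for each provider office
    # Distance from every applicant to every block centroid, then nearest block.
    dists = np.linalg.norm(
        applicant_xy[:, None, :] - km.cluster_centers_[None, :, :], axis=2
    )
    applicant_block = dists.argmin(axis=1)
    return provider_block, applicant_block
```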

The block-randomized design will also allow us to examine secondary research questions, including variation in outcomes across providers. The 19 providers vary in size, operational capacity, and program experience. Exploring the scalability of the OSC program requires a better understanding of which provider characteristics are needed for program success.
Randomization Method
Randomization was done in office, by computer.
Randomization Unit
Individual participants were randomized to one of three or four providers, and to one of the two treatment groups or the control group, within geographic blocks.

Was the treatment clustered?
No

Experiment Characteristics

Sample size: planned number of clusters
n/a
Sample size: planned number of observations
5444
Sample size (or number of clusters) by treatment arms
All six blocks contain study participants in both treatment arms and the control group. The sample-size breakdown by treatment arm for the full study population is as follows:

1) 1258 observed youth were randomized to the treatment arm that paired a job with a weekly mentoring component.

2) 1246 youth were randomized to the treatment arm that only provided a job without the mentoring component.

3) 2940 youth were randomized to the control group of no services.
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
In a 2012 study, which recruited a similar population of youth and used the same data sources as the proposed project, baseline covariates explained between 10 and 50 percent of the variation in outcomes (less for arrests, more for schooling outcomes). Since we have no reason to suspect that the treatment effect will vary by block, we do not account for any variability at the block level. With these assumptions, the overall treatment-control contrast has a minimum detectable effect (MDE) of between 0.05 and 0.08 standard deviations. The across-treatment contrast has an MDE of 0.09 to 0.12 SDs.
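
For reference, the sketch below reproduces approximately these ranges with the standard two-group MDE approximation, assuming 80% power and a two-sided 5% test; those power and significance assumptions are not stated in the registration, so the computed values are close to, but not exactly, the registered MDEs.

```python
# Illustrative MDE calculation using the standard two-group approximation
# MDE = (z_{1-alpha/2} + z_{power}) * sqrt((1 - R^2) / (N * p * (1 - p))),
# with assumed 80% power and a two-sided 5% test (not stated above).
from scipy.stats import norm

def mde_sd_units(n_total, p_treated, r_squared, alpha=0.05, power=0.80):
    """Minimum detectable effect in standard-deviation units of the outcome."""
    multiplier = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    return multiplier * ((1 - r_squared) / (n_total * p_treated * (1 - p_treated))) ** 0.5

# Overall treatment vs. control contrast: 2504 treated of 5444 applicants.
for r2 in (0.1, 0.5):
    print(round(mde_sd_units(5444, 2504 / 5444, r2), 3))  # roughly 0.05-0.07 SDs

# Across-treatment contrast: 1258 vs. 1246 youth.
for r2 in (0.1, 0.5):
    print(round(mde_sd_units(2504, 1258 / 2504, r2), 3))  # roughly 0.08-0.11 SDs
```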
IRB

Institutional Review Boards (IRBs)

IRB Name
Social and Behavioral Sciences Institutional Review Board
IRB Approval Date
2015-05-19
IRB Approval Number
IRB15-0491

Post-Trial

Post Trial Information

Study Withdrawal

There is information in this trial unavailable to the public.

Intervention

Is the intervention completed?
No
Data Collection Complete
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials