
An Experimental Evaluation of Philadelphia WorkReady

Last registered on October 03, 2022

Pre-Trial

Trial Information

General Information

Title
An Experimental Evaluation of Philadelphia WorkReady
RCT ID
AEARCTR-0002451
Initial registration date
September 21, 2017


First published
September 21, 2017, 1:33 PM EDT


Last updated
October 03, 2022, 12:21 PM EDT


Locations

Primary Investigator

Affiliation
University of Michigan

Other Primary Investigator(s)

Additional Trial Information

Status
Completed
Start date
2017-07-03
End date
2022-08-30
Secondary IDs
Prior work
This trial does not extend or rely on any prior RCTs.
Abstract
The City of Philadelphia Mayor’s Office and the Philadelphia Youth Network are partnering to conduct a randomized controlled trial to evaluate the impact of WorkReady, a summer jobs program for disadvantaged youth. Recent evidence from random-assignment studies shows that summer jobs programs in New York City and Chicago dramatically reduce violence involvement among participants, but have small, if any, effects on education and employment. The Philadelphia study is intended to 1) assess how generalizable the prior findings are by testing the crime, employment, and school effects of a different summer jobs program in a new setting (pending data availability), and 2) to better understand mechanisms by expanding tests for program effects to other socially-costly correlates of violence that may also be affected. To study these questions, we will allocate about 1,000 of the 8,000 summer WorkReady slots by lottery. We will track youth in a range of administrative data, as well as collect some supplementary qualitative evidence about youth experiences.
External Link(s)

Registration Citation

Citation
Heller, Sara. 2022. "An Experimental Evaluation of Philadelphia WorkReady." AEA RCT Registry. October 03. https://doi.org/10.1257/rct.2451-5.0
Former Citation
Heller, Sara. 2022. "An Experimental Evaluation of Philadelphia WorkReady." AEA RCT Registry. October 03. https://www.socialscienceregistry.org/trials/2451/history/196855
Sponsors & Partners

There is information in this trial unavailable to the public. Access can be requested through the registry.
Experimental Details

Interventions

Intervention(s)
The Philadelphia Youth Network (PYN) has offered a youth summer jobs program called WorkReady for over 15 years. Using a blend of government and private funding, PYN contracts with 50-60 local agencies that implement the six-week WorkReady summer program. Youth ages 14 to 21 apply with a common application either to PYN or to a local agency directly. Program youth are assigned to a local provider, which places them in one of three program models (service learning, work experience, or internship). All three models focus on developing “21st-Century Workforce Skills” and offer an hourly wage.
Intervention Start Date
2017-07-03
Intervention End Date
2017-08-25

Primary Outcomes

Primary Outcomes (end points)
Final outcomes will depend on data availability. Outcomes for which we are pursuing data include: criminal behavior, education engagement and performance, labor market involvement, health, fertility, and household functioning. We are also conducting qualitative interviews and observations with a small number of study subjects to help document program implementation and context, understand the counterfactual condition, and generate hypotheses about mechanisms.
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
Demand regularly outpaces available resources: last year more than 16,000 youth applied for a summer job through PYN, but fewer than half received a placement. Historically, slots have been allocated either on a first-come, first-served basis or at the provider's discretion. For the 2017 program, PYN has committed to a more equitable distribution of a subset of program slots by implementing a fair lottery. Youth applicants will be blocked by age and geographic location, then individually randomly assigned to be offered the program or not.
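The blocked lottery described above can be sketched as follows. This is an illustrative sketch, not the research team's actual assignment code; the function name, block definitions, and quotas are hypothetical.

```python
import random
from collections import defaultdict

def blocked_lottery(applicants, offers_per_block, seed=0):
    """Blocked individual random assignment.

    applicants: list of (applicant_id, block) pairs, where block is
        e.g. an (age_group, region) tuple.
    offers_per_block: dict mapping each block to its number of slots.
    Returns a dict mapping applicant_id -> True (offered) / False (not).
    """
    rng = random.Random(seed)  # fixed seed makes the lottery reproducible
    blocks = defaultdict(list)
    for applicant_id, block in applicants:
        blocks[block].append(applicant_id)
    assignment = {}
    for block, ids in sorted(blocks.items()):  # sorted for determinism
        rng.shuffle(ids)
        n_offers = offers_per_block[block]
        for i, applicant_id in enumerate(ids):
            assignment[applicant_id] = i < n_offers
    return assignment

# Hypothetical example: two blocks, one program slot in each.
applicants = [(1, ("14-16", "North")), (2, ("14-16", "North")),
              (3, ("17-21", "West")), (4, ("17-21", "West"))]
offers = {("14-16", "North"): 1, ("17-21", "West"): 1}
result = blocked_lottery(applicants, offers)
```

Blocking by age and geography before randomizing guarantees that treatment and control groups are balanced on those characteristics by construction, rather than only in expectation.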
Experimental Design Details
Randomization Method
Randomization will be conducted by the researchers on a computer in an office.
Randomization Unit
Individual
Was the treatment clustered?
No

Experiment Characteristics

Sample size: planned number of clusters
N/A
Sample size: planned number of observations
We anticipate randomizing around 3,000 applicants for 1,000 program slots.
Sample size (or number of clusters) by treatment arms
Youth will be assigned to different program types, but not randomly, so there is a single treatment arm.
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
With 100% take-up, the MDE would be about 0.1. Based on the pilot study, however, we expect the effective take-up rate to be between 30 and 50 percent. If the resulting power is too low, we may repeat the study in summer 2018.
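The take-up adjustment implied here follows the standard Bloom-style logic: with incomplete take-up, the detectable effect on actual participants is the intent-to-treat MDE divided by the take-up rate. A minimal sketch under that assumption (the function name is ours, and we assume no control-group crossover):

```python
def mde_on_participants(mde_itt, takeup):
    """Bloom-style adjustment: with no control-group crossover, the
    minimum detectable effect on participants equals the intent-to-treat
    MDE divided by the take-up rate."""
    if not 0 < takeup <= 1:
        raise ValueError("take-up must be in (0, 1]")
    return mde_itt / takeup

# An ITT MDE of 0.1 with 30-50% take-up implies participant-level
# MDEs of roughly 0.2 to 0.33.
low = mde_on_participants(0.1, 0.5)
high = mde_on_participants(0.1, 0.3)
```

This is why the registration flags a possible repeat of the study in 2018: at 30 percent take-up the effective MDE more than triples relative to full take-up.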
IRB

Institutional Review Boards (IRBs)

IRB Name
University of Pennsylvania Office of Regulatory Affairs, Institutional Review Board
IRB Approval Date
2017-03-09
IRB Approval Number
Protocol 826728
IRB Name
City of Philadelphia Department of Public Health, Institutional Review Board
IRB Approval Date
2017-04-13
IRB Approval Number
2017-02
Analysis Plan

There is information in this trial unavailable to the public. Access can be requested through the registry.

Post-Trial

Post Trial Information

Study Withdrawal

There is information in this trial unavailable to the public. Access can be requested through the registry.

Intervention

Is the intervention completed?
Yes
Intervention Completion Date
August 31, 2018, 12:00 +00:00
Data Collection Complete
Yes
Data Collection Completion Date
August 31, 2020, 12:00 +00:00
Final Sample Size: Number of Clusters (Unit of Randomization)
Was attrition correlated with treatment status?
Final Sample Size: Total Number of Observations
4497
Final Sample Size (or Number of Clusters) by Treatment Arms
Treatment N=1,786, Control N=2,711
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Yes
Reports, Papers & Other Materials

Relevant Paper(s)

Abstract
This paper combines two new summer youth employment experiments in Chicago and Philadelphia with previously published evidence to show how repeated study of an intervention as it scales and changes contexts can guide decisions about public investment. Two sources of treatment heterogeneity can undermine the scale-up and replication of successful human capital interventions: variation in the treatment itself and in individual responsiveness. Results show that these programs generate consistently large proportional decreases in criminal justice involvement, even as administrators recruit additional youth, hire new local providers, find more job placements, and vary the content of their programs. Using both endogenous stratification within cities and variation in 62 new and existing point estimates across cities uncovers a key pattern of individual responsiveness: impacts grow linearly with the risk of socially costly behavior each person faces. Identifying more interventions that combine this pattern of robustness to treatment variation with bigger effects for the most disconnected could aid efforts to reduce social inequality efficiently.
Citation
Heller, Sara B. “When Scale and Replication Work: Learning from Summer Youth Employment Experiments” (2022), Journal of Public Economics, 209(104617)
Abstract
Administrative burden reduces the effectiveness of public social programs by deterring take up among adults, but we know little about the role these burdens play in public programs for young people. This paper uses empirical evidence to assess how different barriers shape adolescents’ take-up of summer jobs programs. In a Philadelphia experiment, we find that reminder emails increased application completion by 1.8 percentage points (12.3 percent), with bigger effects from emphasizing short-term monetary gains. In a non-experimental analysis of Philadelphia and Chicago programs, we show that without individualized support during enrollment, disconnected youth are less likely to participate when offered a slot than their more advantaged peers. However, offering universal personalized support during enrollment makes them as or more likely to participate. These findings suggest administrative burden does constrain the benefits of public spending on youth programs and that reducing burden can increase gains from social programs for young people.
Citation
Bhanot, Syon & Sara B. Heller. “Does Administrative Burden Deter Young People? Evidence from Summer Jobs Programs” (2022), Journal of Behavioral Public Administration, 5(1)

Reports & Other Materials