Evaluation of Practice Makes Perfect Summer Program in New York

Last registered on September 09, 2021


Trial Information

General Information

Evaluation of Practice Makes Perfect Summer Program in New York
Initial registration date
September 07, 2021


First published
September 09, 2021, 7:08 PM EDT



Some information in this trial is not available to the public.

Primary Investigator

University of Notre Dame

Other Primary Investigator(s)

PI Affiliation
University of Notre Dame
PI Affiliation
University of Notre Dame
PI Affiliation
University of Notre Dame
PI Affiliation
Vanderbilt University

Additional Trial Information

Ongoing
Start date
End date
Secondary IDs
Prior work
This trial does not extend or rely on any prior RCTs.
This study evaluates the impact of a summer learning program for students from low-income communities in the New York City area implemented by Practice Makes Perfect (PMP). This program aims to address summer learning loss through academic enrichment, tutoring, and mentoring services. Students in the program are grouped by grade level and receive services from hired teachers five hours per day, four days per week. The research team will work with PMP and their partner schools to implement a randomized controlled trial (RCT) to generate rigorous evidence of the program’s impact on academic outcomes. The initial pilot phase of the RCT was implemented in the summer of 2018. This pilot was intended to lay the foundation for a larger-scale impact evaluation.
External Link(s)

Registration Citation

Chapkis, Connor et al. 2021. "Evaluation of Practice Makes Perfect Summer Program in New York." AEA RCT Registry. September 09. https://doi.org/10.1257/rct.8032-1.0
Experimental Details


Intervention Start Date
Intervention End Date

Primary Outcomes

Primary Outcomes (end points)
Two indices will be constructed as our primary outcomes: an education index based on test scores and related indicators, and an engagement index based on variables such as school attendance, disciplinary actions, on-time grade promotion, and other related measures.
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
1. Administrators at each school provide a list of eligible students to PMP. Eligibility criteria at each school are determined by the individual school administrators. For each PMP class of 20 students, the school will provide a list of at least 120 eligible students. The required number of eligible students exceeds the number of available slots to account for the fact that some students will not take up the offer to participate in PMP, and to allow for a sufficiently sized control group. In particular, a list of at least 120 students allows the take-up rate to be as low as 1-in-3 while still maintaining a control group the same size as the group offered the program. In that case, 60 students would be offered PMP in order to fill 20 slots. These 60 students would constitute the treatment group, while the remaining 60 students would be designated as the control group. The master list of eligible students that the school provides to PMP will include contact information so that PMP can reach the student's family if the student is selected for a PMP slot. This is the typical recruitment process employed by PMP in working with schools to fill slots in their summer school classrooms.

2. PMP provides LEO with a de-identified master list of eligible students and the number of available spots for each school. A separate
list will be provided for each PMP class of 20 students. This de-identified list will include a student ID number, gender, age, and a
family ID to identify siblings.

3. LEO will randomly assign the order of students on the master list for each class to create a randomly ordered waitlist. LEO will then
provide PMP with the student IDs for those at the top of this waitlist.

4. PMP, working with schools, will then offer slots in the program to students in sequential order from this waitlist. If PMP has invited
all students on the list, but still has unfilled slots in the program, LEO will provide PMP with additional student IDs from the randomly
ordered waitlist.
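As an illustrative sketch only, the waitlist procedure in steps 3 and 4 can be expressed in code. All function and variable names below are hypothetical and not drawn from the study's actual implementation; the sketch assumes a seeded shuffle of de-identified IDs and sequential batch release, consistent with the steps described above.

```python
import random

def make_waitlist(student_ids, seed=None):
    """Return a randomly ordered waitlist of de-identified student IDs
    for a single class (step 3)."""
    rng = random.Random(seed)  # seeding makes the ordering reproducible
    waitlist = list(student_ids)
    rng.shuffle(waitlist)
    return waitlist

def next_batch(waitlist, offset, batch_size):
    """Return the next batch of IDs from the top of the waitlist to send
    for recruitment outreach (step 4)."""
    return waitlist[offset:offset + batch_size]

# Example: a class of 20 slots drawn from 120 eligible students,
# with the first 60 names released as initial offers.
ids = [f"S{i:03d}" for i in range(120)]
wl = make_waitlist(ids, seed=42)
first_batch = next_batch(wl, 0, 60)
print(len(first_batch))  # 60
```

Students never released from the bottom of the waitlist would then serve as the control group, as described under Randomization Method.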
Experimental Design Details
Not available
Randomization Method
Upon receiving the lists of eligible students in the appropriate grade levels from each feeder school, the researchers will assign the students to a wait list, specific to each summer school program, in a random order. The research team will provide small batches of ordered student names to PMP staff so they can conduct outreach to recruit students for the program until all spots are filled. Those students at the bottom of the wait list whose names are not transferred to PMP for recruitment comprise the control group.
Randomization Unit
Randomization will occur at the individual student level, although separate random wait lists will be generated by grade level and school.
Was the treatment clustered?

Experiment Characteristics

Sample size: planned number of clusters
480 students across 4 schools
Sample size: planned number of observations
480 students
Sample size (or number of clusters) by treatment arms
240 students will be assigned to treatment and 240 students to control
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
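The registry leaves this field blank. Purely as an illustrative sketch (not a figure reported by the study), the standard two-sample formula gives a ballpark MDE for 240 students per arm, assuming 80% power, a 5% two-sided significance level, individual-level randomization, and no adjustment for covariates or imperfect take-up:

```python
import math
from statistics import NormalDist

def mde_two_sample(n_treat, n_control, alpha=0.05, power=0.80):
    """Minimum detectable effect in standard-deviation units for a
    two-arm, individually randomized trial (illustrative only:
    ignores covariate adjustment and imperfect take-up)."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # critical value, two-sided test
    z_power = NormalDist().inv_cdf(power)          # value for the desired power
    return (z_alpha + z_power) * math.sqrt(1 / n_treat + 1 / n_control)

print(round(mde_two_sample(240, 240), 2))  # ≈ 0.26 SD
```

Note that with the 1-in-3 take-up rate anticipated in the design, the detectable effect on those actually treated would be correspondingly larger than this intent-to-treat figure.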

Institutional Review Boards (IRBs)

IRB Name
University of Notre Dame
IRB Approval Date
IRB Approval Number
Analysis Plan

The analysis plan is not available to the public.