College Summer School: Educational Benefits and Enrollment Preferences
Last registered on May 15, 2020


Trial Information
General Information
College Summer School: Educational Benefits and Enrollment Preferences
Initial registration date
May 14, 2020
Last updated
May 15, 2020 2:25 PM EDT
Primary Investigator
University of Arkansas
Other Primary Investigator(s)
PI Affiliation
University of California, San Diego
Additional Trial Information
Start date
End date
Secondary IDs
We examine whether summer school is a missed opportunity for colleges to accelerate completion. We randomly assign summer scholarships to community college students and link their educational outcomes to their preferences for the scholarships. The scholarships have a large impact on degree acceleration, increasing graduation within one year of the intervention by 32% and transfers to four-year colleges by 53%. Treatment effects are concentrated among students with a preference against summer school. Our results suggest that educational impacts do not drive enrollment preferences, and that many more students could benefit from summer school than the small minority who currently enroll.
External Link(s)
Registration Citation
Brownback, Andy and Sally Sadoff. 2020. "College Summer School: Educational Benefits and Enrollment Preferences." AEA RCT Registry. May 15.
Experimental Details
In our study, we randomly assigned tuition scholarships for summer courses to community college students. The scholarships had a face value of approximately $400 and could be used to pay for tuition for one summer course of up to three credit-hours (they did not cover other costs such as books, materials, and lab fees). The scholarships were only valid for the immediately following summer term and valid at any Ivy Tech campus.
Intervention Start Date
Intervention End Date
Primary Outcomes
Primary Outcomes (end points)
Graduation with an associate degree and transfer to 4-year colleges
Primary Outcomes (explanation)
These outcomes will be pulled directly from the Ivy Tech administrative database. We will measure them within one year of the intervention and within two years of the intervention.
Secondary Outcomes
Secondary Outcomes (end points)
Course completion, enrollment in later semesters, retention into subsequent semesters
Secondary Outcomes (explanation)
These outcomes will be pulled directly from the Ivy Tech administrative database.
Experimental Design
Experimental Design
We conducted the study in two waves: 2016 and 2017. During the Spring 2016 and Spring 2017 terms, our partners identified and recruited currently enrolled students to participate in our study. Interested students enrolled by completing an online survey. After recruitment, our partners matched the students' survey responses to administrative data containing their academic progress: enrollment, grades, credit accumulation, graduation, transfer, and dropout status. This matching was successful for 121 of 156 students in the 2016 cohort (78%) and 284 of 292 students in the 2017 cohort (97%). We only included matched students in the randomization to scholarships, and we thus limit our analysis to these students.

Our experimental sample includes 405 enrolled and matched students across the two cohorts. Based on budget availability, we randomly awarded 69 scholarships in the 2016 cohort (57%) and 97 scholarships in the 2017 cohort (35%).
Experimental Design Details
Randomization Method
The scholarships were assigned using stratified randomization.

In the 2016 cohort, the randomization strata were: one of five GPA groups, the relative value of summer enrollment (elicited through the enrollment survey), age, and gender. In the 2017 cohort, the randomization strata were: one of three GPA groups, age, and gender. We control for differences in the stratification and assignment ratio by using fixed effects for cohort.
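The stratified assignment described above can be sketched in a few lines of Python. This is a minimal illustration, not the authors' actual code: the field names (`gpa_group`, `age_group`, `gender`, `id`) and the helper `stratified_assign` are hypothetical, and the assignment ratio would be set per cohort (roughly 0.57 for 2016 and 0.35 for 2017).

```python
import random
from collections import defaultdict

def stratified_assign(students, ratio, seed=0):
    """Randomly assign treatment within strata at a given ratio.

    `students` is a list of dicts; the stratum key combines the
    GPA group, age group, and gender fields (hypothetical names).
    Returns a dict mapping student id -> True (treated) / False.
    """
    rng = random.Random(seed)

    # Group students into strata.
    strata = defaultdict(list)
    for s in students:
        key = (s["gpa_group"], s["age_group"], s["gender"])
        strata[key].append(s)

    # Shuffle within each stratum and treat the first `ratio` share.
    assignment = {}
    for members in strata.values():
        rng.shuffle(members)
        n_treat = round(len(members) * ratio)
        for i, s in enumerate(members):
            assignment[s["id"]] = i < n_treat
    return assignment
```

Because treatment shares are balanced within each stratum, differences in stratification and assignment ratios across cohorts can then be absorbed with cohort fixed effects, as the registration describes.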
Randomization Unit
Scholarships were assigned at the individual student level.
Was the treatment clustered?
Experiment Characteristics
Sample size: planned number of clusters
405 students
Sample size: planned number of observations
405 students
Sample size (or number of clusters) by treatment arms
405 students
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
IRB Name
University of Arkansas Institutional Review Board
IRB Approval Date
IRB Approval Number
Post Trial Information
Study Withdrawal
Is the intervention completed?
Is data collection complete?
Data Publication
Data Publication
Is public data available?
Program Files
Program Files
Reports, Papers & Other Materials
Relevant Paper(s)