
A Randomized Evaluation of STEM Focused Summer Programs

Last registered on January 03, 2019

Pre-Trial

Trial Information

General Information

Title
A Randomized Evaluation of STEM Focused Summer Programs
RCT ID
AEARCTR-0002888
Initial registration date
June 28, 2018

First published
July 01, 2018, 10:33 AM EDT

Last updated
January 03, 2019, 2:59 PM EST

Locations

Primary Investigator

Affiliation
University of Michigan

Other Primary Investigator(s)

PI Affiliation
Teachers College, Columbia University

Additional Trial Information

Status
Ongoing
Start date
2014-05-15
End date
2019-06-30
Secondary IDs
Abstract
This study estimates the impact of three summer outreach programs for rising high school seniors. The programs are hosted annually by a selective private university that graduates a majority of its students in a science, technology, engineering, or math (STEM) field. The study randomly assigned program applicants between 2014 and 2016 to a control group or one of three summer programs: a six-week residential program, a one-week residential program, and an online program that runs over six months. The population targeted by the programs is high-achieving but faces one or more potential barriers to accessing selective universities or STEM fields, such as being a first-generation college student, low-income, an underrepresented minority, or from a rural high school. The outcomes studied include college application, admission, and enrollment, and student majors. Outcome data come from surveys, the National Student Clearinghouse, and administrative records from 36 selective private universities.
External Link(s)

Registration Citation

Citation
Cohodes, Sarah and Silvia Robles. 2019. "A Randomized Evaluation of STEM Focused Summer Programs." AEA RCT Registry. January 03. https://doi.org/10.1257/rct.2888-2.0
Former Citation
Cohodes, Sarah and Silvia Robles. 2019. "A Randomized Evaluation of STEM Focused Summer Programs." AEA RCT Registry. January 03. https://www.socialscienceregistry.org/trials/2888/history/39872
Experimental Details

Interventions

Intervention(s)
Intervention Start Date
2014-05-15
Intervention End Date
2017-01-15

Primary Outcomes

Primary Outcomes (end points)
Outcome data will come from surveys and from administrative records held by the host institution, the National Student Clearinghouse (NSC), and 36 selective private universities.

The host institution's outreach office has delivered, and will continue to deliver, endline surveys to experimental subjects once a semester after the summer programs conclude for that year. These surveys ask students about their academic, career, and self-confidence outcomes, including planned college applications, college admissions, knowledge of colleges, knowledge of financial aid, interest in STEM, self-reported self-efficacy, life, and study skills, planned or current enrollment, planned or current majors, and academic performance.

The office has collected, and will continue to collect, administrative data from the host institution, the NSC, and the 36 selective private universities (COFHE) to document student academic outcomes. These outcomes include college application, college admission, college enrollment, persistence, graduation, GPA, and college major.
Primary Outcomes (explanation)
Between surveys and administrative data, a large number of outcome variables are collected in this analysis. If these were analyzed individually, some would be expected to show significant differences between the programs solely due to chance. To avoid emphasizing spurious results, outcomes are grouped into related families. Following Anderson (2008), each family is converted into an index according to the following procedure:
• For each individual outcome in the family, define the variable such that higher values are “better.”
• Normalize each outcome into a z-score relative to the control group for that cohort; that is, subtract the cohort-specific control-group mean and divide by the cohort-specific control-group standard deviation.
• Construct a weighted average of all the outcomes in the family, with weights derived from the inverse of the covariance matrix of the outcomes, so that highly correlated outcomes receive less weight. If a component of the index is missing, the index is calculated as the weighted average of the remaining components.
• Use the indices, rather than their component variables, as the main outcome measures when examining program differences.
There are indices related to college preferences, knowledge about the college and financial aid application process, confidence, skills, and application strategy. We also construct indices measuring the selectivity of the schools a student is admitted to and of the school at which the student matriculates. Standalone outcomes are used for application, admission, and enrollment at elite technical schools and at the host institution. We will use these outcome data to compare academic and career outcomes across the experimental groups.
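
For concreteness, the index construction above can be sketched in a few lines of code. The following Python function is a minimal illustration of the Anderson (2008) procedure; the function name and implementation details are our own assumptions, not taken from the study's code. It uses the row sums of the inverse covariance matrix of the z-scores as weights and re-weights over non-missing components.

    import numpy as np

    def anderson_index(Y, control, cohort):
        """Inverse-covariance-weighted index following Anderson (2008).

        Y       : (n, k) array of outcomes, each oriented so higher = better
        control : (n,) boolean mask flagging control-group members
        cohort  : (n,) array of cohort labels (e.g., 2014, 2015, 2016)
        Missing components are handled by re-weighting over the
        non-missing outcomes for each student.
        """
        Y = np.asarray(Y, dtype=float)
        Z = np.full_like(Y, np.nan)

        # z-score each outcome against the cohort-specific
        # control-group mean and standard deviation
        for c in np.unique(cohort):
            ctrl = control & (cohort == c)
            mu = np.nanmean(Y[ctrl], axis=0)
            sd = np.nanstd(Y[ctrl], axis=0, ddof=1)
            rows = cohort == c
            Z[rows] = (Y[rows] - mu) / sd

        # weights are the row sums of the inverted covariance matrix of
        # the z-scores, so highly correlated outcomes count less
        cov = np.ma.cov(np.ma.masked_invalid(Z), rowvar=False).data
        w = np.linalg.inv(cov).sum(axis=1)

        # weighted average per student, re-normalizing the weights over
        # whichever components are non-missing
        index = np.full(Z.shape[0], np.nan)
        for i in range(Z.shape[0]):
            ok = ~np.isnan(Z[i])
            if ok.any():
                index[i] = Z[i, ok] @ w[ok] / w[ok].sum()
        return index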

Secondary Outcomes

Secondary Outcomes (end points)
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
We randomized students into four groups: three intervention summer programs and one control group. The three programs in the evaluation are a six-week residential program, a one-week residential program, and an online program. The host institution operates all three programs and recruits students from across the United States, between their junior and senior years of high school, who face significant barriers to attending college or pursuing a STEM career.
Experimental Design Details
Randomization Method
The control group was created through the random assignment process. The host institution's outreach office receives many more applicants than the programs can serve. Following the office's normal procedures, about 700 students from an applicant pool of about 2,000 were selected each year for the randomization process. The selection committee was made up of stakeholders, community members, and affiliates with longstanding ties to the outreach office. The committee ranked applicants in terms of suitability for the six-week program, and the outreach office made the final program assignment by lottery after all students were ranked. Rankings were a holistic reflection of each applicant's fit for the six-week program, assessed on academic ability and interest in STEM as indicated by grades, test scores, letters of recommendation, and application essays. In addition, the program's mission of serving traditionally underrepresented populations makes it appropriate to consider the following risk factors holistically during selection, though no single element in isolation guarantees admission:

1. the individual would have been the first in the family to attend college;
2. there was an absence of science and engineering backgrounds in the individual’s family;
3. the individual’s high school historically sent fewer than 50% of its graduates to four-year colleges;
4. the applicant attended a school that presented challenges for success at an urban elite university (e.g., rural or predominantly minority);
5. the individual was a member of a group that is underrepresented in the study of science and engineering (African American, Latino, or Native American).

In addition to each student’s individual characteristics, rankings in 2015 and 2016 were also affected by regional weights designed to increase representation from less dense parts of the country.

In 2014, 653 applicants were ranked by the selection committee. 2014 acted as a pilot year in the sense that the host institution was still exploring how the changes in the selection process would affect operations; as a result, nineteen students were admitted to the six-week program through the traditional process. These students, labeled “certainty spots,” are excluded from the experiment. The 634 students who were not offered a certainty spot were sorted by their rank and placed into one of three blocks based on two cutoffs. Students above the first cutoff were randomly offered admission to either the six-week or the one-week program. Students between the first and second cutoffs were offered admission to either the one-week or the online program. Finally, applicants below the second cutoff were offered admission to either the online program or no program (the control group). The ranking cutoffs defining the three blocks were chosen based on the capacity constraints of each program and to maintain similar sample sizes across the treatment arms within a block.

In 2015, there were 701 ranked applicants. The randomization design differed from the previous year in two ways: there were only four “certainty spots,” and the number of blocks was reduced to two. Applicants above a single ranking cutoff were randomly assigned to the six-week, the one-week, or the online program; applicants below the cutoff were assigned to either the online program or the control group, as in 2014. The reduction from three blocks to two means that less extrapolation is necessary when comparing programs across blocks.

In 2016, there were 749 ranked applicants, plus one applicant who withdrew his or her application during the randomization process (whom we treat as having rejected the program offer). Unlike previous years, there were no “certainty spots” in 2016. For all three cohorts, program assignment was stratified by gender, so program spots were always divided evenly between female and male applicants within each block.
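
As an illustration of the 2014 design described above, the sketch below shows one way the rank-based blocking and gender-stratified lottery could be implemented in Python. All names, cutoffs, and the even within-stratum split are hypothetical simplifications; the actual spot counts reflected each program's capacity.

    import numpy as np

    rng = np.random.default_rng(2014)  # fixed seed for reproducibility

    def assign_2014(ranks, female, cut1, cut2):
        """Schematic of the 2014 block design (hypothetical names/cutoffs).

        ranks  : committee rank, 1 = best, certainty spots already removed
        female : boolean array; assignment is stratified by gender
        cut1, cut2 : rank cutoffs separating the three blocks
        """
        arms = {1: ("six-week", "one-week"),   # block 1
                2: ("one-week", "online"),     # block 2
                3: ("online", "control")}      # block 3
        block = np.where(ranks <= cut1, 1, np.where(ranks <= cut2, 2, 3))
        assignment = np.empty(len(ranks), dtype=object)
        for b, (arm_a, arm_b) in arms.items():
            for g in (True, False):            # stratify by gender
                idx = rng.permutation(
                    np.flatnonzero((block == b) & (female == g)))
                half = len(idx) // 2           # even split is a simplification;
                assignment[idx[:half]] = arm_a # real spot counts followed
                assignment[idx[half:]] = arm_b # program capacities
        return block, assignment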

For comparing each program to the control group, full randomization of the entire ranked pool would have been ideal, but the modified block designs used in 2014, 2015, and 2016 took the outreach office's programmatic considerations into account: the office was concerned that the most qualified candidates might receive no program while relatively less qualified candidates received the more intensive interventions.

The randomization process is visually represented in Figure 1 in the “supporting documents/materials” section of this registration.
Randomization Unit
Individual
Was the treatment clustered?
No

Experiment Characteristics

Sample size: planned number of clusters
No clusters.
Sample size: planned number of observations
We will analyze the outcomes of about 2,100 students who applied to the program over the course of three years.
Sample size (or number of clusters) by treatment arms
six-week program = 250 students, one-week program = 308 students, online program = 472 students, and control = 1,073 students
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
In order to conduct the analysis described above, our sample must be of sufficient size to detect treatment effects. We used findings from Robles’ retrospective study of the six-week residential program as plausible estimates of treatment effect sizes, assuming two cohorts (2014 and 2015); a third cohort of randomization was added ex post. While we had enough power to detect effects in our initial power calculation for the 2014 and 2015 cohorts, the additional year of randomization will increase our precision.

Using estimates from the observational study, Robles found a 13.9 percentage point increase in matriculation at the host institution. The control group mean for matriculation at the host institution was 15.4%, with a standard deviation of 36%. With 122 students in each of the treatment and control groups (61 per year over two years), a standard alpha level of 0.05, and a two-tailed test, we have a power level of 86%. Since the standard power level to minimize type II errors is 80%, we are powered to detect effect sizes similar to those found in the observational study. If we use a one-sided test (which assumes the effect of attending the six-week program can only be positive, a plausible assumption given the findings from the retrospective evaluation), our power reaches 92%. The additional 2016 cohort will increase our power further.

One concern is that our effect sizes will be smaller, since we will be comparing the six-week program to the one-week program rather than to a pure control. We are confident that we will still be able to detect effects for two reasons: first, we exceed the 80% power threshold, so there is some room for power to decrease; and second, we will control for randomization block when estimating treatment effects, which will absorb some of the variance in outcomes and increase our power.
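
As a rough check, the power figures above can be approximately reproduced with a normal approximation that treats matriculation as a difference in means with a common standard deviation of 0.36 (the control-group SD reported above) and 122 students per group. These assumptions are ours; the exact method the calculation used is not stated, and small discrepancies likely reflect rounding or the precise test employed.

    from scipy.stats import norm

    # assumed inputs: 13.9 pp effect, common SD 0.36, n = 122 per group
    effect, sd, n, alpha = 0.139, 0.36, 122, 0.05

    se = sd * (2 / n) ** 0.5                 # SE of the difference in means
    z_two = norm.ppf(1 - alpha / 2)          # two-sided critical value (1.96)
    z_one = norm.ppf(1 - alpha)              # one-sided critical value (1.645)

    power_two = norm.cdf(effect / se - z_two)   # ~0.85, vs. the 86% reported
    power_one = norm.cdf(effect / se - z_one)   # ~0.91, vs. the 92% reported
    print(f"two-sided power: {power_two:.2f}, one-sided power: {power_one:.2f}")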
Supporting Documents and Materials

There is information in this trial unavailable to the public.
IRB

Institutional Review Boards (IRBs)

IRB Name
Harvard University – Area Committee on the Use of Human Subjects
IRB Approval Date
2016-04-02
IRB Approval Number
CR-21946-05
IRB Name
Teachers College, Columbia University
IRB Approval Date
2016-04-04
IRB Approval Number
3882
IRB Name
National Bureau of Economic Research, Inc.
IRB Approval Date
2016-05-11
IRB Approval Number
FWA #00003692

Post-Trial

Post Trial Information

Study Withdrawal

There is information in this trial unavailable to the public.

Intervention

Is the intervention completed?
No
Data Collection Complete
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials