Input incentives, student engagement, and post-secondary success: Experimental evidence from a national college advising program

Last registered on May 17, 2023

Pre-Trial

Trial Information

General Information

Title
Input incentives, student engagement, and post-secondary success: Experimental evidence from a national college advising program
RCT ID
AEARCTR-0011413
Initial registration date
May 12, 2023

First published
May 17, 2023, 2:43 PM EDT

Last updated
May 17, 2023, 3:48 PM EDT

Locations

Region

Primary Investigator

Affiliation
University of Virginia

Other Primary Investigator(s)

PI Affiliation
University of Virginia

Additional Trial Information

Status
Completed
Start date
2021-09-01
End date
2023-01-31
Secondary IDs
Prior work
This trial does not extend or rely on any prior RCTs.
Abstract
Recent studies find that intensive college advising models can significantly improve college choice and later success among lower-income students (e.g., Barr and Castleman, 2021; Bettinger and Evans, 2019; Castleman, Deutschlander, and Lohner, 2020). However, the impacts are more modest for remote advising programs like the CollegePoint initiative (Sullivan et al., 2021). In an attempt to further improve college choice and success, other programs have included a financial incentive component; evaluations of these programs show that it is the combination of financial incentives and intensive support services that drives the large impacts (Angrist, Lang, and Oreopoulos, 2009; Carrell and Sacerdote, 2017).

In other education contexts, research suggests that incentivizing student inputs (e.g., time spent in a learning module) can be more effective and efficient than incentivizing student outputs (e.g., test scores) (Clark et al., 2020; Fryer, 2011; Hirshleifer, 2021). This pattern likely arises because students have a high degree of control over task-oriented inputs, whereas they face greater uncertainty about outputs that occur further in the future and often lack the skills needed to improve those outcomes on their own.
In this paper, we test the addition of input-based incentives to an existing remote advising program targeted toward high-achieving, low- to moderate-income high school students. This population is of particular interest given the high degree of undermatch based on college quality (Hoxby and Avery, 2013) and the relationship between college quality and social mobility (Chetty et al., 2017).
External Link(s)

Registration Citation

Citation
Bird, Kelli and Ben Castleman. 2023. "Input incentives, student engagement, and post-secondary success: Experimental evidence from a national college advising program." AEA RCT Registry. May 17. https://doi.org/10.1257/rct.11413-1.1
Experimental Details

Interventions

Intervention(s)
Advising Plus was an input-based incentive program encouraging high-achieving, low- and moderate-income students to complete key inputs in the college and financial aid application process, with the goal of improving students' enrollment quality. Advising Plus was implemented by CollegePoint, a non-profit national college advising program. Advising Plus incentivized students to engage regularly with a remote college advisor, to apply to well-matched colleges and universities, and to review financial aid packages with their advisor. Advising Plus also provided students with $500 to defray costs associated with successfully transitioning to their intended college or university, given the large body of research demonstrating the positive effects of additional financial assistance on college enrollment (Dynarski, Page, and Scott-Clayton, 2022) and the more focused literature on financial barriers that arise during the summer after high school and can deter students from following through on their college intentions (Castleman and Page, 2013).
Intervention Start Date
2021-09-01
Intervention End Date
2022-06-30

Primary Outcomes

Primary Outcomes (end points)
The primary outcomes relate to the incentivized behaviors and to students' college enrollment quality. CollegePoint, the organization that implemented Advising Plus, defines enrollment quality as whether students attend a college or university with a six-year graduation rate of at least 70 percent (referred to as "CollegePoint schools"). Our primary outcome is therefore whether students enroll at a CollegePoint school in the fall immediately following high school. Other primary outcomes connected to the incentivized behaviors include: (1) the frequency and timing of students' meetings with a CollegePoint advisor; (2) whether students report applying to a CollegePoint school; and (3) whether students report meeting with someone to review their financial aid award letters.
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
Secondary outcomes include whether students report acceptances to CollegePoint schools; whether students report their top choice institution (among their acceptance choice set) being a CollegePoint school; and student reports of the topics they discussed with their advisor and how influential their advisor was on their college plans and decisions.

Given prior evidence that undermatch is concentrated among geographically dispersed, high-achieving, low-income students, who have fewer selective institutions close to where they live and less access to college information or advising (Avery and Hoxby, 2012; Hoxby and Turner, 2013), we will investigate potential heterogeneity in the impact of Advising Plus based on students' geographic residence and proximity to selective institutions.
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
TBA.
Experimental Design Details
Randomization Method
CollegePoint contracted with Evaluation and Assessment Solutions for Education (https://www.ease-eval.org/) to conduct the randomization, which was performed in Stata.
Randomization Unit
Randomization was conducted at the student level.
Was the treatment clustered?
No
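Student-level randomization with a fixed number of treated students can be sketched as follows. This is an illustrative Python analogue of the Stata procedure described above, not the actual assignment code; the seed and variable names are assumptions, while the arm sizes (1,998 Advising Plus, 2,817 Advising Standard, 4,815 total) come from the sample-size fields below.

```python
import random

def assign_treatment(student_ids, n_treat, seed=2021):
    """Randomly assign exactly n_treat students to Advising Plus;
    the remainder receive Advising Standard (status quo advising)."""
    rng = random.Random(seed)  # fixed seed for a reproducible draw
    shuffled = list(student_ids)
    rng.shuffle(shuffled)
    treated = set(shuffled[:n_treat])
    return {sid: ("Advising Plus" if sid in treated else "Advising Standard")
            for sid in shuffled}

# Hypothetical student IDs; arm sizes match the registered sample.
assignment = assign_treatment(range(4815), n_treat=1998)
counts = {}
for arm in assignment.values():
    counts[arm] = counts.get(arm, 0) + 1
print(counts)  # 1,998 in Advising Plus, 2,817 in Advising Standard
```

Fixing the count of treated students (rather than flipping an independent coin per student) guarantees the planned arm sizes exactly.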

Experiment Characteristics

Sample size: planned number of clusters
N/A
Sample size: planned number of observations
4,815
Sample size (or number of clusters) by treatment arms
1,998 in Advising Plus, 2,817 in Advising Standard
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
MDE = 3.5 percentage points
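The registered MDE can be roughly reproduced with a standard two-sample proportion power calculation. The sketch below is an assumption-laden reconstruction: the arm sizes are from this registration, but the 25% baseline enrollment rate, 5% significance level, and 80% power are illustrative parameters not stated in the registry.

```python
from math import sqrt

def mde_two_proportions(p0, n_treat, n_control, z_alpha=1.959964, z_power=0.841621):
    """Minimum detectable effect for a two-sample proportion test under the
    normal approximation, with variance evaluated at the baseline rate p0.
    Default z values correspond to alpha = 0.05 (two-sided) and power = 0.80."""
    se = sqrt(p0 * (1 - p0) * (1 / n_treat + 1 / n_control))
    return (z_alpha + z_power) * se

# Assumed baseline rate of 25%; arm sizes from the registration.
mde = mde_two_proportions(p0=0.25, n_treat=1998, n_control=2817)
print(f"MDE = {mde:.3f}")  # prints MDE = 0.035, i.e. 3.5 percentage points
```

Under these assumed parameters the calculation matches the registered 3.5 percentage-point MDE; a different baseline rate would shift the result.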
IRB

Institutional Review Boards (IRBs)

IRB Name
University of Virginia
IRB Approval Date
2022-09-06
IRB Approval Number
5367

Post-Trial

Post Trial Information

Study Withdrawal

Intervention

Is the intervention completed?
No
Data Collection Complete
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials