Why College X? The Effect of Institutional Attributes and Aid Packages on Student Willingness to Pay

Last registered on January 19, 2024

Pre-Trial

Trial Information

General Information

Title
Why College X? The Effect of Institutional Attributes and Aid Packages on Student Willingness to Pay
RCT ID
AEARCTR-0012427
Initial registration date
January 18, 2024

First published
January 19, 2024, 2:19 PM EST

Locations

Region

Primary Investigator

Affiliation
Amherst College

Other Primary Investigator(s)

PI Affiliation
Amherst College

Additional Trial Information

Status
In development
Start date
2024-01-12
End date
2024-02-12
Secondary IDs
Prior work
This trial does not extend or rely on any prior RCTs.
Abstract
In this study, we examine the motivations underlying where students decide to attend college. Specifically, we estimate the effects of selectivity, diversity, and aid packages on student willingness to pay (WTP) for college and likelihood of matriculation, drawing on a robust set of demographic information. In collaboration with the online survey company College Pulse, we field a discrete choice experiment (DCE) programmed in Qualtrics in which 400 first-year college students choose among three randomly selected hypothetical institutions on each of 13 menus (including 2 attention check menus).
External Link(s)

Registration Citation

Citation
Debnam Guzman, Jakina and Carla Mattaliano. 2024. "Why College X? The Effect of Institutional Attributes and Aid Packages on Student Willingness to Pay." AEA RCT Registry. January 19. https://doi.org/10.1257/rct.12427-1.0
Sponsors & Partners

Partner

Type
Private company
Experimental Details

Interventions

Intervention(s)
Survey participants are randomly shown 13 menus, each consisting of 3 hypothetical institutions plus a "None of the Above" option. The combinations of institutions shown on each menu are randomized across all participants; with 400 possible college profiles, there are 10,586,800 possible 3-profile menu combinations. Participants are also randomly assigned to one of 4 menu-order "treatment" groups, which vary the order in which the same attributes are displayed within an institutional profile.
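As a quick check on the menu-combination count above (a sketch, not part of the registered survey materials), the figure of 10,586,800 is the number of ways to draw 3 distinct profiles from 400:

from math import comb

# Number of distinct 3-profile menus drawn from 400 hypothetical institutions
# (order within a menu does not matter; "None of the Above" is fixed on every menu).
n_profiles = 400
menu_size = 3
print(comb(n_profiles, menu_size))  # 10586800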
Intervention (Hidden)
Intervention Start Date
2024-01-12
Intervention End Date
2024-02-12

Primary Outcomes

Primary Outcomes (end points)
1. Likelihood of being selected
2. Willingness to pay (WTP)
3. Likelihood of matriculation
4. Prioritization (preference) of attributes
Primary Outcomes (explanation)
Trade-off ratios between attributes will be constructed using a likelihood-ratio method to recover WTP and the prioritization (preference ranking) of attributes.
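For context, a common way to construct such trade-off ratios in choice-based conjoint studies (stated here as background; the registration does not spell out the exact estimator) is to take ratios of utility coefficients estimated by maximum likelihood in a conditional logit model: WTP_k = -b_k/b_cost, where b_k is the estimated utility weight on attribute k and b_cost is the weight on net cost of attendance.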

Secondary Outcomes

Secondary Outcomes (end points)
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
I use a discrete choice experiment (i.e., choice-based conjoint analysis), in which 400 respondents complete an online survey in Qualtrics, choosing among three institutional options (plus a "None of the Above" option) on each of 11 menus (plus two attention check menus). Each institutional alternative is described by five attributes: admissions selectivity, student diversity rating, financial aid, merit scholarship, and net cost of attendance. I also collect a robust set of demographic information and attitudes from each participant to explore heterogeneity in the results.
Experimental Design Details
I utilize a discrete choice experiment (i.e., choice-based conjoint analysis), in which 400 respondents complete an online survey in Qualtrics, choosing among three institutional options (plus a "None of the Above" option) on each of 11 menus (plus two attention check menus). Each institutional alternative is described by five attributes: admissions selectivity, student diversity rating, financial aid, merit scholarship, and net cost of attendance. The sample size was calculated using a rule of thumb commonly used for discrete choice experiments (Johnson & Orme, 2003): N > 500c/(ta), where N = sample size, t = number of choice tasks, a = number of alternatives per choice task, and c = the largest product of levels of any two attributes when considering two-way interactions. For my calculation, N > (500 x 5 x 5)/(11 x 3), so N > 379. To account for participants who must be dropped from the results (e.g., for rushing through the survey), I recruit an additional 5% of respondents, for a target of roughly 398, rounded to 400, participants.
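The rule-of-thumb calculation above can be reproduced directly (a minimal sketch using only the numbers stated in this section):

import math

# Johnson & Orme (2003) rule of thumb for choice-based conjoint: N > 500c / (t * a)
c = 5 * 5  # largest product of levels of any two attributes (two 5-level attributes)
t = 11     # choice tasks per respondent, excluding attention checks
a = 3      # alternatives per task, excluding "None of the Above"

n_min = 500 * c / (t * a)
print(math.ceil(n_min))         # 379
print(math.ceil(n_min * 1.05))  # 398, rounded to 400 in recruitment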

The survey company I am working with is College Pulse, a reputable online polling platform focused on surveying college students. College Pulse has built the largest database of college student opinion, with over 850,000 student panelists from 1,500 universities across all 50 states. The company handles all participant compensation; students receive variable incentives depending on survey length and drop-off rate.

Based on all possible levels for each attribute (minus a few prohibited combinations, imposed to keep profiles realistic), there are 400 possible hypothetical institutions. For the survey, I create a multiple-choice question in Qualtrics with 400 possible answer options, where each option consists of a different hypothetical institution graphic. A Qualtrics setting then allows only three (randomly selected) of those 400 possible answer choices to be shown to each participant.

I include 11 real (non-attention-check) menus with 3 alternatives per choice task (plus "None of the Above") and a total of 19 levels across all attributes: 5 levels for selectivity, 4 levels for student diversity rating, 5 levels for aid composition, and 5 levels for net cost. All possible levels are listed below; a sketch enumerating the resulting profile space follows the list:

1. Admissions Selectivity: 10%, 30%, 50%, 70%, 90%
2. Student Diversity Rating: 1 star, 2 stars, 3 stars, 4 stars
3. Aid Offer Composition: No Financial Aid or Merit Scholarship, 100% Financial Aid + 0% Merit Scholarship, 90% Financial Aid + 10% Merit Scholarship, 10% Financial Aid + 90% Merit Scholarship, 0% Financial Aid + 100% Merit Scholarship
4. Net Cost of Attendance: $14,700/year, $34,400/year, $45,600/year, $57,100/year, $84,200/year
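As a sketch of the profile space these levels imply (the specific prohibited combinations are not listed in this registration, so the is_allowed filter below is a hypothetical placeholder):

from itertools import product

selectivity = ["10%", "30%", "50%", "70%", "90%"]
diversity = ["1 star", "2 stars", "3 stars", "4 stars"]
aid = [
    "No Financial Aid or Merit Scholarship",
    "100% Financial Aid + 0% Merit Scholarship",
    "90% Financial Aid + 10% Merit Scholarship",
    "10% Financial Aid + 90% Merit Scholarship",
    "0% Financial Aid + 100% Merit Scholarship",
]
net_cost = ["$14,700/year", "$34,400/year", "$45,600/year", "$57,100/year", "$84,200/year"]

# Full factorial: 5 x 4 x 5 x 5 = 500 candidate profiles.
all_profiles = list(product(selectivity, diversity, aid, net_cost))
print(len(all_profiles))  # 500

def is_allowed(profile):
    """Hypothetical placeholder for the study's realism constraints."""
    return True  # the registration states these constraints leave 400 profiles

profiles = [p for p in all_profiles if is_allowed(p)]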

I also include a way to check that the order in which attributes are presented on a menu does not meaningfully affect decisions (e.g., participants prioritizing the admissions selectivity of the three options simply because that attribute is displayed first). I randomize the order of admissions selectivity and student diversity rating; these two attributes always appear before the financial aid offer and merit aid offer, whose order is also randomized. Net cost of attendance is always listed as the last attribute. These randomizations give four possible permutations, which make up the four "treatment" groups in the study (a sketch of this assignment follows the block list below). Participants are randomly assigned to one of the four treatment groups at the beginning of the study. The sequence of blocks presented to respondents is as follows:

1. Informed Consent
2. Screening
3. College Admissions Portal Simulation
4. Learning Survey Interface
5. Attribute Descriptions
6. Attention Check
7. Choice-Based Conjoint
8. Demographics
9. Attitudes/Opinions Questions on a 7-point scale
10. Exit Survey
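The menu-order treatment assignment described above could be mirrored in code as follows (a minimal sketch; the study itself relies on Qualtrics' built-in randomizer rather than custom code):

import random

# Two independent order flips yield the four "treatment" permutations:
# (selectivity vs. diversity shown first) x (financial aid vs. merit shown first);
# net cost of attendance is always displayed last.
ORDERINGS = [
    ("selectivity", "diversity", "financial_aid", "merit_scholarship", "net_cost"),
    ("diversity", "selectivity", "financial_aid", "merit_scholarship", "net_cost"),
    ("selectivity", "diversity", "merit_scholarship", "financial_aid", "net_cost"),
    ("diversity", "selectivity", "merit_scholarship", "financial_aid", "net_cost"),
]

def assign_treatment(rng):
    """Randomly assign a participant to one of the four attribute orderings."""
    return rng.choice(ORDERINGS)

rng = random.Random(2024)  # seed chosen only to make the sketch reproducible
print(assign_treatment(rng))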
Randomization Method
The randomization is done by the Qualtrics system.
Randomization Unit
Randomization is at the individual level for menu order treatment groups and at the menu-individual level for choices.
Was the treatment clustered?
No

Experiment Characteristics

Sample size: planned number of clusters
400 first-year U.S. college students.
Sample size: planned number of observations
400 first-year U.S. college students.
Sample size (or number of clusters) by treatment arms
There are approximately 100 students per menu-order treatment group (4 groups). Given that there are 400 institutional profiles and each of the 400 respondents is shown a total of 33 different profiles (11 menus x 3 alternatives), each profile will likely be shown an average of (400 x 33)/400 = 33 times across all respondents.
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
IRB

Institutional Review Boards (IRBs)

IRB Name
Amherst College IRB
IRB Approval Date
2023-10-19
IRB Approval Number
23-048
Analysis Plan

There is information in this trial unavailable to the public.

Post-Trial

Post Trial Information

Study Withdrawal

There is information in this trial unavailable to the public.

Intervention

Is the intervention completed?
No
Data Collection Complete
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials