Understanding Student Perceptions of College Applications Recommendations

Last registered on September 17, 2024

Pre-Trial

Trial Information

General Information

Title
Understanding Student Perceptions of College Applications Recommendations
RCT ID
AEARCTR-0014368
Initial registration date
September 15, 2024

First published
September 17, 2024, 1:49 PM EDT

Locations

Location information for this trial is not available to the public.

Primary Investigator

Affiliation
Yale University

Other Primary Investigator(s)

Additional Trial Information

Status
In development
Start date
2024-09-15
End date
2026-06-30
Secondary IDs
Prior work
This trial does not extend or rely on any prior RCTs.
Abstract
The study involves a survey of high school students, administered at their schools during regular school hours. The questionnaire will present different scenarios involving hypothetical students applying to college, their initial college portfolio choices, and the recommendations given to them by either a human or algorithmic recommender. Half of the subjects will be randomly assigned a 'human recommender' questionnaire, while the other half will receive an 'algorithmic recommender' questionnaire. The study aims to (1) measure differences in students’ perceptions of college application recommendations when these are provided by a human counselor vs. an algorithm, and (2) identify the mechanisms that drive students’ perceptions of such recommendations.
External Link(s)

Registration Citation

Citation
Monachou, Faidra. 2024. "Understanding Student Perceptions of College Applications Recommendations." AEA RCT Registry. September 17. https://doi.org/10.1257/rct.14368-1.0
Experimental Details

Interventions

Intervention(s)
Half of the students will be randomly assigned a 'human recommender' questionnaire, while the other half will receive an 'algorithmic recommender' questionnaire. We will measure differences in student perceptions of human- vs. algorithm-based recommendations by presenting three hypothetical scenarios about the choices of three fictional students.
Intervention Start Date
2024-09-17
Intervention End Date
2024-10-30

Primary Outcomes

Primary Outcomes (end points)
Recommendation Adoption (Binary): The key outcome variable assesses whether the respondent agrees with the given recommendation or reverts to the fictional student's original choice.
Primary Outcomes (explanation)
The primary outcome variable will be recommendation adoption. This binary variable (1 for adoption of the recommendation, 0 for rejection) will allow us to assess average alignment with the recommendation across the two arms of the survey: (i) human recommender vs. (ii) algorithmic recommender.
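The registration does not specify an analysis script; the following is a minimal sketch of the arm-level comparison implied by this outcome. The column names (arm, adopted) and the use of a two-proportion z-test are illustrative assumptions, not the registered analysis plan.

# Illustrative sketch only: compares mean recommendation adoption between the
# two survey arms. Column names and the choice of test are assumptions, not
# part of the registered analysis plan.
import pandas as pd
from statsmodels.stats.proportion import proportions_ztest

def compare_adoption(df: pd.DataFrame) -> dict:
    """df has one row per response, with columns 'arm' ('human' or
    'algorithmic') and 'adopted' (1 = adopted the recommendation, 0 = rejected)."""
    counts = df.groupby("arm")["adopted"].agg(["sum", "count"])
    human, algo = counts.loc["human"], counts.loc["algorithmic"]

    # Two-proportion z-test of adoption rates across the two arms.
    z_stat, p_value = proportions_ztest(
        count=[human["sum"], algo["sum"]],
        nobs=[human["count"], algo["count"]],
    )
    return {
        "adoption_rate_human": human["sum"] / human["count"],
        "adoption_rate_algorithmic": algo["sum"] / algo["count"],
        "z_stat": z_stat,
        "p_value": p_value,
    }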

Secondary Outcomes

Secondary Outcomes (end points)
- Trust in Intent (Numerical): The variable measures the respondent's trust that the recommender is acting in the persona's best interest.
- Trust in Competence (Numerical): The variable measures the respondent's trust that the recommender is able to make correct and appropriate recommendations.
- Understanding of Rationale (Numerical): The variable measures the respondent's self-assessed understanding of the rationale behind the recommendation.
- Compatibility of Goals (Numerical): The variable measures the extent to which the respondent agrees that the recommender shares the hypothetical character's goals in making the recommendation.
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
With the same recommendations for the fictional student in each scenario,
- Control Arm: Participants will be informed that the recommendations for the hypothetical student were made by a (human) career counselor.
- Treatment Arm: Participants will be informed that the recommendations for the hypothetical student were made by an algorithmic recommendation system.
Experimental Design Details
Not available
Randomization Method
Paper-based questionnaires from both arms are shuffled at random and then distributed to the participants in each classroom (an illustrative sketch of this assignment follows below).
Randomization Unit
Individual (student)
Was the treatment clustered?
No
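As a purely hypothetical sketch of the assignment logic described above: the study randomizes physical paper questionnaires, and an even split within each classroom is an assumption made here for illustration only.

# Hypothetical sketch of the within-classroom assignment described above.
# The actual study distributes randomized paper questionnaires; this only
# illustrates shuffling a roughly balanced deck of the two versions.
import random

def assign_questionnaires(n_students, seed=None):
    """Return a shuffled list of questionnaire versions for one classroom,
    split as evenly as possible between the two arms."""
    rng = random.Random(seed)
    deck = (["human"] * (n_students // 2)
            + ["algorithmic"] * (n_students - n_students // 2))
    rng.shuffle(deck)
    return deck

# Example: one classroom of 25 students.
print(assign_questionnaires(25, seed=42))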

Experiment Characteristics

Sample size: planned number of clusters
no clusters (1,100 individual students)
Sample size: planned number of observations
1,100 students
Sample size (or number of clusters) by treatment arms
human recommender: 550 students, algorithmic recommender: 550 students
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
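The registration leaves the minimum detectable effect size blank. Purely as an illustration, and assuming a 5% significance level, 80% power, and a baseline adoption rate of 0.5 (none of these values appear in the registration), the minimum detectable difference in adoption rates with 550 students per arm could be approximated as follows.

# Illustration only: the registration does not report an MDE. The alpha, power,
# and baseline adoption rate below are assumptions, not registered values.
from scipy.stats import norm

def mde_two_proportions(n_per_arm, p_baseline=0.5, alpha=0.05, power=0.80):
    """Approximate minimum detectable difference in proportions for a
    two-arm comparison with equal arm sizes."""
    z_alpha = norm.ppf(1 - alpha / 2)
    z_beta = norm.ppf(power)
    se = (2 * p_baseline * (1 - p_baseline) / n_per_arm) ** 0.5
    return (z_alpha + z_beta) * se

print(round(mde_two_proportions(550), 3))  # roughly 0.084 under these assumptions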
IRB

Institutional Review Boards (IRBs)

IRB Name
Yale University Institutional Review Board
IRB Approval Date
2024-08-23
IRB Approval Number
2000038360