Behavioral considerations in a dynamic matching mechanism

Last registered on November 19, 2024

Pre-Trial

Trial Information

General Information

Title
Behavioral considerations in a dynamic matching mechanism
RCT ID
AEARCTR-0007218
Initial registration date
February 18, 2021

Initial registration date is when the trial was registered.

It corresponds to when the registration was submitted to the Registry to be reviewed for publication.

First published
February 18, 2021, 6:23 AM EST

First published corresponds to when the trial was first made public on the Registry after being reviewed.

Last updated
November 19, 2024, 9:30 AM EST

Last updated is the most recent time when changes to the trial's registration were published.

Locations

Region

Primary Investigator

Affiliation
WZB & DIW Berlin

Other Primary Investigator(s)

PI Affiliation
Université de Lausanne
PI Affiliation
Université de Lausanne

Additional Trial Information

Status
Completed
Start date
2021-02-18
End date
2022-09-30
Secondary IDs
Prior work
This trial does not extend or rely on any prior RCTs.
Abstract
We examine how confidence about one's academic ranking and knowledge of the mechanism affect application behavior in postsecondary education. To this end, we conduct a survey experiment with participants in the French college assignment system (Parcoursup). Before applicants decide on their final applications and before universities start to make their offers, we elicit applicants’ preferred order of universities. To study confidence, we elicit their beliefs about their relative GPA rank, and one treatment group receives information on their true GPA rank. To study mechanism knowledge, we provide another treatment group with information on the rules of the mechanism. We investigate the effects of information provision on application behavior as well as on predicted and actual placement.
External Link(s)

Registration Citation

Citation
Hakimov, Rustamdjan, Renke Schmacker and Camille Terrier. 2024. "Behavioral considerations in a dynamic matching mechanism." AEA RCT Registry. November 19. https://doi.org/10.1257/rct.7218-1.1
Experimental Details

Interventions

Intervention(s)
Intervention (Hidden)
In this experiment, subjects will be randomized into one of four conditions in a 2x2 design: Grade feedback / No feedback x Mechanism information / No information.
In the first dimension, half of the participants will receive feedback about their true rank in the GPA distribution relative to a reference sample of 1,000 French students. The reference sample was surveyed a couple of weeks before the main survey using the same recruitment channel as the main survey (social media ads) and asked for their most recent GPA. In the main survey, subjects are asked in an incentive-compatible way to state their belief about how they rank in terms of GPA compared to the reference sample (which we describe to them).
In the second dimension, half of the participants will answer a multiple-choice question about their understanding of the Parcoursup mechanism and are provided with the correct answer. The correct answer is that offers made by trainings during the procedure cannot be withdrawn, even if students wait a long time before ultimately accepting one. More precisely: "The system is such that you might receive offers in the later stages of the process from a better university. There is no risk of waiting until the end of the procedure and observing all offers that you could get, as universities cannot withdraw the offers." Thus, we emphasize the "safety" of waiting for potentially better offers in the later stages of Parcoursup, addressing the observation that many students accept an early offer although they may receive better offers later on.
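To make the grade-feedback treatment concrete, here is a minimal sketch of how a respondent's true percentile rank could be computed against the reference sample's GPA distribution. The GPA values, the 0-20 scale, and the function name are illustrative assumptions, not taken from the study materials.

```python
import bisect

# Hypothetical reference distribution: most recent GPAs (assumed French 0-20 scale)
# reported by the pre-survey respondents, sorted in ascending order. Values are illustrative.
reference_gpas = sorted([11.2, 12.5, 13.0, 13.8, 14.1, 14.9, 15.4, 16.0, 16.7, 17.3])

def gpa_percentile_rank(gpa, reference):
    """Share of the reference sample with a GPA strictly below the respondent's (in percent)."""
    below = bisect.bisect_left(reference, gpa)  # number of reference GPAs below `gpa`
    return 100 * below / len(reference)

# Feedback a treated respondent with a hypothetical GPA of 14.5 might see:
rank = gpa_percentile_rank(14.5, reference_gpas)
print(f"Your GPA is higher than {rank:.0f}% of the reference sample.")
```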
Intervention Start Date
2021-02-18
Intervention End Date
2021-03-12

Primary Outcomes

Primary Outcomes (end points)
We will estimate the impact of the treatment on application behavior as measured by: (1) the prediction of the final assignment, (2) the submitted list on Parcoursup, (3) the final placement, (4) the date of final acceptance, (5) first offer acceptance, and (6) alignment of the submitted list with elicited preferences.
Primary Outcomes (explanation)
(1) Prediction of final assignment: After the intervention, we ask subjects, in an incentivized way, to select their predicted final assignment from the list of trainings they aim to apply for on Parcoursup. We measure the rank of the predicted training on the initial rank-order list and the selectivity of the predicted training, based on the percentage of applicants admitted in 2020.
(2) Submitted list on Parcoursup: Using the administrative data, we assess the length of the submitted list (number of applications) and the proportion of applicants who do not exhaust the entire list. Moreover, we measure the selectivity of the submitted list on Parcoursup based on the percentage of applicants admitted in 2020.
(3) Final placement: In the administrative data, we measure the rank of the final placement in the rank-order list elicited in the initial survey. Moreover, we measure the selectivity of the final placement based on the percentage of applicants admitted in 2020.
(4) Date of final acceptance: In the administrative data, we check the date on which subjects make their final decision to accept an offer (both unconditionally and conditional on having pending offers ranked higher on their ROL).
(5) First offer acceptance: In the administrative data, we check how many subjects accept the first offer.
(6) Alignment of the submitted list with elicited preferences: We compare the trainings submitted on Parcoursup with the list of trainings elicited in the survey.

Secondary Outcomes

Secondary Outcomes (end points)
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
In this survey experiment, subjects will be randomized into one of four conditions in a 2x2 design: Grade feedback / No feedback x Mechanism information / No information. In the survey, we ask for participants’ rank-order list (ROL) of preferred trainings. We randomize subjects into one of the four treatments and elicit their predicted placement in an incentive-compatible way. Using their personal information, we merge their responses with the administrative Parcoursup data to study their actual application behavior.
Experimental Design Details
We use social media ads to recruit our sample of Parcoursup participants. These recruitment channels are particularly suited for our study since the majority of Parcoursup participants are between 17 and 19 years of age and social media penetration is close to universal in this age group. The ad redirects participants to our Qualtrics survey.
The main survey is preceded by a pre-survey conducted a couple of weeks earlier. In the pre-survey, we ask 1,000 students about their most recent GPA (first trimester of the final year of high school).
In the main survey, as a first step, we elicit all participants' rank-order lists (ROL) of trainings. To this end, we elicit the list of programs they aim to apply for, as well as their preference for each program relative to their most preferred program (following de Haan, Gautier, Oosterbeek, and van der Klaauw 2019). Next, we ask for their most recent trimester grade point average (GPA). To measure applicants’ confidence in their academic achievement relative to others, we elicit beliefs about their percentile rank in the grade distribution in an incentivized way. Afterwards, subjects are randomized into treatments at the individual level. Depending on the treatment, subjects receive information (grade feedback and/or information on the mechanism) or not. Finally, we ask them to predict their final assignment in an incentivized way.
In the survey, we ask for participants’ national student number to be able to merge their responses with the administrative data. As we expect that many subjects will not have this number available, we also elicit personal information (name, birth date, school, postal code) to be able to merge their responses with the administrative data afterwards.
In the analysis, we estimate average treatment effects by regressing the outcomes on treatment dummies. To increase precision, we control for baseline characteristics. Moreover, we analyze heterogeneous treatment effects by gender, socioeconomic background, type of school certificate, and scholarship status.
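As an illustration of this analysis step, here is a minimal sketch of the ATE and heterogeneity regressions using statsmodels. The file name, variable names, and the choice of robust standard errors are assumptions made for the sketch, not specifications taken from the analysis plan.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical analysis file: one row per respondent matched to the administrative data,
# with treatment dummies, an outcome, and baseline controls (all column names illustrative).
df = pd.read_csv("matched_survey_admin_data.csv")

# Average treatment effects of the two cross-randomized interventions,
# controlling for baseline characteristics to increase precision.
ate = smf.ols(
    "days_to_final_acceptance ~ grade_feedback + mechanism_info"
    " + baseline_gpa + female + scholarship",
    data=df,
).fit(cov_type="HC1")
print(ate.summary())

# Heterogeneous treatment effects, e.g. by gender: interact both treatment dummies
# with the moderator.
het = smf.ols(
    "days_to_final_acceptance ~ (grade_feedback + mechanism_info) * female"
    " + baseline_gpa + scholarship",
    data=df,
).fit(cov_type="HC1")
print(het.summary())
```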
Randomization Method
Computerized randomization using Qualtrics.
Randomization Unit
Individual
Was the treatment clustered?
No

Experiment Characteristics

Sample size: planned number of clusters
We aim for 6,000 individuals but will collect as many respondents as possible during the period in which the ads are planned to run (February 17 until March 11).
Sample size: planned number of observations
We aim for 6,000 individuals but will collect as many respondents as possible during the period in which the ads are planned to run (February 17 until March 11).
Sample size (or number of clusters) by treatment arms
We aim for 1,500 individuals in each of our four treatment arms. We target a rather large sample size because only individuals whom we can match to the administrative data can be used in the final analysis.
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
Based on previous Parcoursup data, we can calculate the following minimum detectable effect sizes (80% power, alpha=0.05). Regarding the days from first offer until final acceptance, we can detect a minimum effect size of 2.1 days. Regarding the percentage of applicants who do not exhaust their application list, we can detect an effect of 5.4 percentage points. Regarding the proportion who accept their first offer, we can detect an effect of 7.2 percentage points. Regarding the outcomes that rely on the elicited preferences, we cannot calculate an effect size.
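For readers who want to see how such figures can be obtained, here is a minimal sketch of a minimum-detectable-effect calculation with statsmodels. The per-arm sample size follows the planned 1,500 respondents; the standard deviation and baseline proportion are placeholders, since the registration does not report the values used.

```python
from statsmodels.stats.power import NormalIndPower, TTestIndPower
from statsmodels.stats.proportion import proportion_effectsize

alpha, power = 0.05, 0.80
n_per_arm = 1500  # planned respondents per arm; the matched analysis sample may be smaller

# Standardized MDE (Cohen's d) for a two-sided comparison of two equally sized arms.
d = TTestIndPower().solve_power(effect_size=None, nobs1=n_per_arm,
                                alpha=alpha, power=power, ratio=1.0)

# Translate into natural units for a continuous outcome such as days until final acceptance,
# using a placeholder standard deviation (the registration does not report the SD used).
sd_days = 20.0
print(f"MDE, days until final acceptance: {d * sd_days:.1f} days")

# For a binary outcome such as accepting the first offer, check the power implied by
# candidate effects against a placeholder baseline proportion.
p0 = 0.50
for delta in (0.05, 0.06, 0.072):
    h = proportion_effectsize(p0 + delta, p0)  # Cohen's h
    achieved = NormalIndPower().power(effect_size=h, nobs1=n_per_arm, alpha=alpha)
    print(f"effect of {delta:.3f}: power = {achieved:.2f}")
```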
IRB

Institutional Review Boards (IRBs)

IRB Name
HEC Ethics Committee
IRB Approval Date
2021-01-25
IRB Approval Number
COSTA
Analysis Plan

Analysis Plan Documents

Specification of hypotheses

MD5: 5b99b846dfd64cf1910b945b081c2ad7

SHA1: eecacc0d2bc285c8766459e543262867cc23377f

Uploaded At: February 17, 2021

Post-Trial

Post Trial Information

Study Withdrawal


Intervention

Is the intervention completed?
No
Data Collection Complete
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials