Information Complexity and Advice in the Brazilian College Admissions System: A Choice Experiment

Last registered on January 22, 2026

Pre-Trial

Trial Information

General Information

Title
Information Complexity and Advice in the Brazilian College Admissions System: A Choice Experiment
RCT ID
AEARCTR-0017665
Initial registration date
January 15, 2026

First published
January 22, 2026, 6:29 AM EST

Locations

Region

Primary Investigator

Affiliation
University of Lausanne

Other Primary Investigator(s)

PI Affiliation
University of Macau

Additional Trial Information

Status
In development
Start date
2026-01-16
End date
2026-01-24
Secondary IDs
Prior work
This trial does not extend or rely on any prior RCTs.
Abstract
We study how information presentation (a simple versus a complex user interface) and advice about the uncertainty of preliminary cutoffs affect applicants’ program choices in a centralized admissions environment inspired by SISU/ENEM. Participants face a choice set of 12 programs: they may apply to up to two programs, and they observe the preliminary admission cutoffs over three days as well as last year’s cutoff.
Participants are randomized in a 2×2 design. Treatment 1 varies the user interface: a simple interface in which cutoff information is directly visible, versus a complex interface in which cutoff information is hidden and must be accessed by clicking, as in real SISU. Treatment 2 varies the presence of advice. The advice emphasizes intense competition and the uncertainty of final cutoffs, highlighting the risk of relying mechanically on last-day cutoffs.
Participants make a single, one-shot simulated SISU application decision. We measure how UI complexity and advice affect (i) admission status, (ii) allocative efficiency, (iii) actual admission in the real SISU system, and (iv) diversification of choices. Allocative efficiency is measured using experimental payoffs derived from admission outcomes.
External Link(s)

Registration Citation

Citation
Bo, Inacio and Rustamdjan Hakimov. 2026. "Information Complexity and Advice in the Brazilian College Admissions System: A Choice Experiment." AEA RCT Registry. January 22. https://doi.org/10.1257/rct.17665-1.0
Experimental Details

Interventions

Intervention(s)
This study features two independent individual-level randomizations implemented in a 2×2 design:
• Treatment 1 (UI complexity):
– Simple UI: Cutoff information is visible directly on each program card.
– Complex UI: Cutoff information is hidden and can be accessed only by clicking on a program.
• Treatment 2 (Advice):
– No Advice: No additional information is provided.
– Advice: Participants receive on-screen advice before the final choice, emphasizing competition and cutoff uncertainty.
Advice is informational only and does not recommend specific programs or personalized strategies.
Intervention (Hidden)
Intervention Start Date
2026-01-16
Intervention End Date
2026-01-23

Primary Outcomes

Primary Outcomes (end points)
All outcomes are based on the single observed application decision:
• Admission status: Indicator for admission in the simulated market.
• Payoff (efficiency): Total experimental payoff (points) derived from the assigned program.
• Actual SISU admission: Admission in the real SISU system, conditional on data availability.
• Diversification of choices: Difference between final cutoffs of the first and second choices.
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
• Cutoff-certainty perceptions, constructed from three Likert-scale items:
– “The last-day cutoff is a reliable predictor of the final cutoff.”
– “Cutoffs can change substantially from one day to the next.”
– “Last year’s cutoff is informative for this year’s admissions.”
Responses are aggregated into an index of cutoff-certainty perceptions (one plausible aggregation is sketched after the list of controls below).
Other Measures (Controls)
• Stereotypes regarding gender differences in ENEM performance.
• Self-reported confidence in admission to first and second choices.
• Social-norm indices: perceived family and peer pressure, perceived prestige of fields, and perceived popularity of specific programs (e.g., medicine).
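The cutoff-certainty index described above could be constructed, for example, by standardizing the three Likert items, reverse-coding the second item (which expresses uncertainty rather than certainty), and averaging. The Python sketch below illustrates this one plausible scoring rule; it is an assumption for exposition, not the authors' registered construction.

import numpy as np

def cutoff_certainty_index(item1, item2, item3):
    # item1: "last-day cutoff is a reliable predictor"     (certainty)
    # item2: "cutoffs can change substantially day to day"  (uncertainty, reverse-coded)
    # item3: "last year's cutoff is informative"            (certainty)
    def z(x):
        x = np.asarray(x, dtype=float)
        return (x - x.mean()) / x.std()
    # Higher index values indicate stronger perceived certainty about cutoffs.
    return (z(item1) - z(item2) + z(item3)) / 3.0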
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
Participants from the current SISU centralized college admissions cohort are recruited through social-media advertisements to participate in an online survey experiment administered via Qualtrics. As an incentive for participation, respondents are informed that every 40th participant receives an Amazon voucher with a minimum value of R$200. In addition, participants can earn higher rewards through performance in the incentivized task.
The core task is a simulated admissions decision with up to two program applications and deterministic admissions based on an exam score and program-specific final cutoffs. The simulated market is calibrated using real data from past SISU allocations.
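As an illustration of this deterministic rule, the sketch below derives admission and the associated experimental payoff from a participant's exam score, ranked choices, and the realized final cutoffs; variable names are assumptions for exposition rather than the study's implementation.

def simulate_admission(score, ranked_choices, final_cutoffs, payoffs):
    # ranked_choices: list of 1 or 2 program IDs, in order of preference
    # final_cutoffs:  dict mapping program ID -> realized final cutoff
    # payoffs:        dict mapping program ID -> experimental payoff (points)
    # The participant is admitted to the highest-ranked program whose
    # realized final cutoff does not exceed the exam score.
    for program in ranked_choices:
        if score >= final_cutoffs[program]:
            return {"admitted": True, "program": program, "payoff": payoffs[program]}
    return {"admitted": False, "program": None, "payoff": 0}

Under the same notation, the diversification outcome defined earlier is simply final_cutoffs[first choice] minus final_cutoffs[second choice] whenever two programs are listed.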
After providing informed consent, participants complete a pre-survey collecting background characteristics (e.g., demographics), self-reported ENEM score, baseline confidence about their relative position in the ENEM distribution, and stereotypical beliefs about the gender composition of top scorers. Participants are then randomly assigned to one of four conditions: Simple–No Advice, Simple–Advice, Complex–No Advice, or Complex–Advice.
In the main task, participants view a set of 12 anonymized programs and observe the payoff associated with admission to each program. Participants are informed that voucher payoffs, conditional on winning the lottery (1-in-40 probability), increase with the payoff value of the program to which they are admitted. Payoffs of programs range from R$10 to R$100.
In the Simple UI conditions, participants observe a table displaying all programs along with daily cutoffs (Days 1–3) and last year’s cutoff. In the Complex UI conditions, cutoff information is hidden behind clickable panels, and only one program’s information can be viewed at a time; opening a new program hides previously viewed information. This design closely replicates the SISU user interface.
Participants in all treatments submit a single ranked application (up to two programs) within a three-minute time window. No revisions or additional choice rounds are allowed.
In the Advice conditions, participants observe the following advisory message displayed above the submission interface:
“Competition is high and final cutoffs are uncertain. For instance, in 2021, in 42% of programs the final cutoff ended above the last-day cutoff. When this happened, the median increase was about 5 points, a meaningful jump. In 5% of cases, the increase exceeded 27 points.”
Admissions are determined using the realized final cutoffs from the simulated market.
After completing the task, participants fill out a post-experiment survey.
Experimental Design Details
Randomization Method
All randomizations are conducted at the individual level within Qualtrics using Survey Flow randomizers. Assignment is fully exogenous and balanced across conditions.
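For intuition only, the balanced individual-level assignment performed by the Qualtrics randomizer can be mimicked with a simple block-randomization sketch; the condition labels and block size here are illustrative assumptions, not the survey's actual configuration.

import random

CONDITIONS = ["Simple-NoAdvice", "Simple-Advice", "Complex-NoAdvice", "Complex-Advice"]

def assign_conditions(n_participants, seed=2026):
    # Assign participants to the 2x2 cells in shuffled blocks of four,
    # which keeps cell sizes balanced as recruitment proceeds.
    rng = random.Random(seed)
    assignments = []
    while len(assignments) < n_participants:
        block = CONDITIONS[:]  # one copy of each condition per block
        rng.shuffle(block)
        assignments.extend(block)
    return assignments[:n_participants]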
Randomization Unit
individual
Was the treatment clustered?
No

Experiment Characteristics

Sample size: planned number of clusters
1000
Sample size: planned number of observations
1000
Sample size (or number of clusters) by treatment arms
Planned number of clusters: Not applicable (individual-level randomization).
Planned number of observations: Target N = 1,000 participants. Recruitment via social media implies uncertainty about realized sample size. Previous campaigns using similar methods yielded approximately 3000 responses in France (2021). Given the larger Brazilian applicant pool, we aim for substantially higher enrollment.
Data collection continues until the SISU deadline (January 2026). With a 2×2 factorial design, the target implies approximately 250 participants per cell. If realized sample sizes differ, we will report achieved cell counts and conduct all analyses on an intent-to-treat basis.
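As a rough benchmark for these cell sizes, a conventional two-sample power calculation (80% power, 5% two-sided test) can be sketched as follows; these parameters are illustrative assumptions rather than registered values.

from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()

# Pairwise comparison of two cells (about 250 participants each).
mde_cell = analysis.solve_power(nobs1=250, alpha=0.05, power=0.80, ratio=1.0)

# Main effect of one factor, pooling across the other factor (about 500 vs. 500).
mde_main = analysis.solve_power(nobs1=500, alpha=0.05, power=0.80, ratio=1.0)

print(f"MDE (Cohen's d), 250 vs 250: {mde_cell:.2f}")  # roughly 0.25
print(f"MDE (Cohen's d), 500 vs 500: {mde_main:.2f}")  # roughly 0.18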
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
IRB

Institutional Review Boards (IRBs)

Post-Trial

Post Trial Information

Study Withdrawal

Intervention

Is the intervention completed?
No
Data Collection Complete
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials