The Tailoring Premium: How AI Design Unlocks Student Engagement and Learning

Last registered on November 19, 2025

Pre-Trial

Trial Information

General Information

Title
The Tailoring Premium: How AI Design Unlocks Student Engagement and Learning
RCT ID
AEARCTR-0015266
Initial registration date
January 23, 2025

Initial registration date is when the trial was registered. It corresponds to when the registration was submitted to the Registry to be reviewed for publication.

First published
January 27, 2025, 8:29 AM EST

First published corresponds to when the trial was first made public on the Registry after being reviewed.

Last updated
November 19, 2025, 11:54 AM EST

Last updated is the most recent time when changes to the trial's registration were published.

Locations

Primary Investigator

Affiliation
KU Leuven

Other Primary Investigator(s)

PI Affiliation
KU Leuven

Additional Trial Information

Status
Completed
Start date
2025-01-30
End date
2025-04-30
Secondary IDs
Prior work
This trial does not extend or rely on any prior RCTs.
Abstract
We provide the first causal estimate of the "tailoring premium" for educational AI. In a randomized trial with 2,440 secondary students, we demonstrate that offering a curriculum-tailored chatbot increases immediate learning by 0.126 standard deviations, while a generic chatbot produces learning gains that are statistically indistinguishable from traditional instruction. The learning gain is driven by a 13.2 percentage point increase in module completion, indicating that the tool's value comes from solving the problem of student engagement. For compliers (students induced to complete the module by the tailored design), the effect is larger and more durable, increasing long-term knowledge retention by a significant 0.309 standard deviations. The returns to educational AI depend critically on its design, not just its availability.
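The complier effect described above suggests an instrumental-variables (LATE) estimation in which random assignment to the tailored chatbot instruments module completion. A minimal, hypothetical sketch of such an estimate using two-stage least squares is given below; the file name and the columns score, completed, and assigned_tailored are illustrative assumptions, not the trial's actual data or code.

```python
# Hypothetical sketch of a LATE/IV estimate: random assignment to the
# tailored chatbot (assigned_tailored) instruments module completion
# (completed); the outcome is the standardized follow-up test score.
# Column and file names are illustrative assumptions only.
import pandas as pd
from linearmodels.iv import IV2SLS

df = pd.read_csv("trial_data.csv")  # hypothetical file

# Second stage: score on completed, with completed instrumented by assignment.
model = IV2SLS.from_formula(
    "score ~ 1 + [completed ~ assigned_tailored]", data=df
)
result = model.fit(cov_type="robust")
print(result.summary)
```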
External Link(s)

Registration Citation

Citation
De Witte, Kristof and Jaime Polanco-Jimenez. 2025. "The Tailoring Premium: How AI Design Unlocks Student Engagement and Learning." AEA RCT Registry. November 19. https://doi.org/10.1257/rct.15266-1.1
Experimental Details

Interventions

Intervention(s)
Traditional Instruction: Students receive standard financial literacy instruction using traditional methods and materials.
Group 1 (Control Group). These students receive a classic learning path with instructions and use a spreadsheet or calculator.
Group 2 (Treatment Group 1). This group receives a reduced learning path in which they must find the answers using a general-purpose AI chatbot.
Group 3 (Treatment Group 2). This group follows instructions from a tailored AI chatbot that adapts its questions and instructions to the students' answers.
Intervention (Hidden)
Intervention Start Date
2025-01-30
Intervention End Date
2025-04-30

Primary Outcomes

Primary Outcomes (end points)
Learning Performance
Attitude and Motivation
Learning Experience & User Experience
Self-Confidence & Self-Efficacy
Primary Outcomes (explanation)
Learning Performance: Learning performance will be measured using pre- and post-tests designed to assess students' understanding of key financial literacy concepts, including tax systems. The tests will consist of multiple-choice questions, problem-solving exercises, and case studies. The pre-test and post-test will use the same questions.
Attitude and Motivation: Attitude and motivation towards financial literacy will be measured using a validated survey instrument administered before and after the intervention. The survey will assess students' interest in financial topics, their perceived importance of financial knowledge, and their motivation to learn more.
Learning Experience & User Experience: Students will be asked a series of Likert-scale questions about their individual experiences with learning. Researchers will also assess user experience via heatmaps, session durations, and other usage data from the technology.
Self-Confidence & Self-Efficacy: Self-confidence and self-efficacy related to financial literacy will be measured using a validated survey instrument administered before and after the intervention. The survey will assess students' beliefs in their ability to understand financial concepts, manage their finances, and make informed financial decisions.

Secondary Outcomes

Secondary Outcomes (end points)
Heterogeneities by gender, vocational school, language spoken at home, previous knowledge of AI...
Secondary Outcomes (explanation)
Heterogeneities by gender, vocational school, language spoken at home, previous knowledge of AI...: We will examine the interaction between the learning approach and student characteristics to assess differences in outcomes across groups.
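A minimal sketch of how such interaction effects could be examined, assuming an OLS specification with treatment-by-characteristic interaction terms; the file name and columns post_score, pre_score, treatment, and gender are illustrative assumptions, not the trial's actual variable names.

```python
# Hypothetical sketch: heterogeneous treatment effects via OLS with
# treatment x characteristic interaction terms (statsmodels formula API).
# Column and file names are illustrative assumptions only.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("trial_data.csv")  # hypothetical file

# C(treatment) encodes the three arms; the interaction with gender tests
# whether the treatment effect differs across groups.
model = smf.ols("post_score ~ C(treatment) * C(gender) + pre_score", data=df)
result = model.fit(cov_type="HC1")  # heteroskedasticity-robust standard errors
print(result.summary())
```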

Experimental Design

Experimental Design
This study uses a randomized controlled trial (RCT) design to evaluate the impact of different learning approaches on financial literacy among secondary school students. Participants will be randomly assigned to one of three groups: a control group receiving standard instruction, a treatment group using general-purpose AI for learning support, and a second treatment group receiving a different form of AI-enhanced instruction. All students will receive instruction on financial literacy topics. Key measures, including learning performance and student experiences, will be assessed at various points during the study.
Experimental Design Details
This study uses a randomized controlled trial (RCT) design. Students will be randomly assigned to one of three groups:
Group 1 (Control Group). These students receive a classic learning path with instructions and use a spreadsheet or calculator.
Group 2 (Treatment Group 1). This group receives a reduced learning path in which they must find the answers using a general-purpose AI chatbot.
Group 3 (Treatment Group 2). This group follows instructions from a tailored AI chatbot that adapts its questions and instructions to the students' answers.
All students will receive instruction on financial literacy topics. Learning performance, attitudes, motivation, and learning experiences will be measured before and after the intervention. A follow-up test of learning performance will be administered two months after the intervention.
Randomization Method
Students will be assigned to one of the three learning groups using a random number generator within the LimeSurvey platform. Once a student completes the baseline survey in LimeSurvey, the platform will automatically assign them to a group using a pre-programmed random number sequence, ensuring each student has an equal chance of being assigned to any of the three groups.
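The assignment itself is performed inside LimeSurvey. As a point of reference, the following is a minimal sketch of an equivalent individual-level randomization with equal assignment probabilities; the group labels, seed, and sample size are illustrative assumptions, not LimeSurvey's internal code.

```python
# Minimal sketch of equal-probability assignment to the three arms,
# analogous to the random-number assignment performed by LimeSurvey.
# Group labels, seed, and sample size are illustrative assumptions.
import random

GROUPS = ["control", "general_ai", "tailored_ai"]

def assign_group(rng: random.Random) -> str:
    """Assign one student to a group with probability 1/3 each."""
    return rng.choice(GROUPS)

rng = random.Random(2025)  # fixed seed for reproducibility of the sketch
assignments = [assign_group(rng) for _ in range(2440)]
print({g: assignments.count(g) for g in GROUPS})
```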
Randomization Unit
Individual: students are assigned to treatment groups individually.
Was the treatment clustered?
No

Experiment Characteristics

Sample size: planned number of clusters
0
Sample size: planned number of observations
Based on the power analysis, we expect more than 732 students to participate.
Sample size (or number of clusters) by treatment arms
244 students per treatment arm (732 in total across the three arms)
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
A power analysis was conducted to determine the minimum sample size needed to detect a statistically significant difference between the combined AI-assisted learning groups and the control group on the primary outcome (post-test score). Assuming a small effect size (Cohen's d = 0.2), an alpha level of 0.05, and a desired power of 0.80, the analysis indicated a required sample size of 732 participants in total (244 per treatment arm).
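As a point of reference, the following is a minimal sketch of the standard two-sample power calculation under the stated parameters (d = 0.2, alpha = 0.05, power = 0.80). It illustrates the calculation only; the registered sample sizes also reflect the three-arm design and allocation assumed by the authors, so the figures need not coincide exactly.

```python
# Minimal sketch of a two-sample power calculation under the stated
# assumptions (d = 0.2, alpha = 0.05, power = 0.80), using the standard
# normal-approximation formula n = 2 * (z_{1-a/2} + z_{1-b})^2 / d^2.
# The registered sample sizes additionally depend on the three-arm design.
from scipy.stats import norm

d, alpha, power = 0.2, 0.05, 0.80
z_alpha = norm.ppf(1 - alpha / 2)  # two-sided critical value
z_beta = norm.ppf(power)

n_per_arm = 2 * (z_alpha + z_beta) ** 2 / d**2
print(round(n_per_arm))  # required n per arm for a simple two-arm comparison
```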
IRB

Institutional Review Boards (IRBs)

IRB Name
Toetsing Privacy en Ethiek (PRET)
IRB Approval Date
2024-11-08
IRB Approval Number
G-2024-8468

Post-Trial

Post Trial Information

Study Withdrawal

There is information in this trial unavailable to the public.

Intervention

Is the intervention completed?
No
Data Collection Complete
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials