AI in Higher Education: Experimental Insights into Motivational and Cognitive Barriers

Last registered on April 03, 2025

Pre-Trial

Trial Information

General Information

Title
AI in Higher Education: Experimental Insights into Motivational and Cognitive Barriers
RCT ID
AEARCTR-0015659
Initial registration date
March 28, 2025


First published
April 03, 2025, 12:36 PM EDT


Locations

Location information for this trial is not available to the public.

Primary Investigator

Affiliation
Utrecht University

Other Primary Investigator(s)

PI Affiliation
Swansea University

Additional Trial Information

Status
In development
Start date
2025-03-20
End date
2030-06-30
Secondary IDs
Prior work
This trial does not extend or rely on any prior RCTs.
Abstract
This project explores how generative AI can support personalized learning and whether it affects student motivation and performance gaps. While AI has the potential to boost learning, many students remain skeptical, especially for creative tasks. These attitudes may differ by background or task type.

The study uses a randomized controlled trial in the field. Participants complete tasks with or without AI support, and the goal is to measure how AI affects their performance and motivation in subsequent tasks.

This research will also examine barriers to using AI effectively. It aims to determine whether these barriers vary by gender, socioeconomic status, and subject of study. The results will inform the design of educational tools that make AI more helpful and fairer for all students.

External Link(s)

Registration Citation

Citation
Rezaei, Sarah and Bastian Westbrock. 2025. "AI in Higher Education: Experimental Insights into Motivational and Cognitive Barriers." AEA RCT Registry. April 03. https://doi.org/10.1257/rct.15659-1.0
Experimental Details

Interventions

Intervention(s)
Undergraduate students are randomly assigned different versions of a coursework assignment. Some versions include GenAI-generated answer suggestions below numerical questions. The intervention is the provision of these AI-generated suggestions, enabling a comparison between students who receive this support and those who do not.

Intervention Start Date
2025-03-30
Intervention End Date
2030-03-31

Primary Outcomes

Primary Outcomes (end points)
The primary outcomes are (1) students’ performance on the individual coursework assignment, measured by the total mark awarded (out of 100), and (2) students’ motivation, measured through self-reported responses on a short post-assignment survey. These outcomes will be used to evaluate the effect of AI-generated answer suggestions on both academic achievement and student motivation.

Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
This randomized controlled trial involves undergraduate students who are assigned different versions of a coursework assignment. Some versions include AI-generated answer suggestions. All students report their expected mark, and those with GenAI support also rate its helpfulness. Coursework data will be matched with administrative records on student background and prior academic performance to analyze outcomes and heterogeneity.
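As a minimal sketch of the kind of analysis this design supports (the variable names, the dataset file, and the use of OLS with robust standard errors are illustrative assumptions, not part of the registration), the matched coursework and administrative data could be analyzed along these lines:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical matched dataset: one row per student, combining coursework
# marks with administrative records. Column names are illustrative only.
df = pd.read_csv("matched_coursework_data.csv")

# Average effect of receiving AI-generated suggestions on the mark (0-100).
ate = smf.ols("mark ~ treated", data=df).fit(cov_type="HC1")

# Heterogeneity: interact treatment with background characteristics,
# e.g. gender and prior academic performance.
het = smf.ols("mark ~ treated * (female + prior_gpa)", data=df).fit(cov_type="HC1")

print(ate.summary())
print(het.summary())
```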
Experimental Design Details
Not available
Randomization Method
Randomization is based on the last digit of each student’s university-assigned student number, which is effectively random. This approach assigns students to one of four coursework versions in a predetermined, non-manipulable manner.
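As a minimal sketch of this rule, assuming the last digit is mapped to the four coursework versions by taking it modulo 4 (the exact digit-to-version mapping is not specified here), the assignment could be implemented as follows:

```python
def assign_version(student_number: str) -> int:
    """Assign a coursework version (0-3) from the last digit of the
    university-assigned student number. The modulo-4 mapping is an
    assumption for illustration; any fixed digit-to-version table
    would serve the same purpose."""
    last_digit = int(student_number.strip()[-1])
    return last_digit % 4

# Example: a student number ending in 7 maps to version 3.
print(assign_version("202400127"))
```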

Randomization Unit
The unit of randomization is the individual student. Each student is assigned to a coursework version based on the last digit of their student number, which serves as a randomization mechanism at the individual level.
Was the treatment clustered?
No

Experiment Characteristics

Sample size: planned number of clusters
Approximately 1000 individual students (1000 clusters at the individual level).
Sample size: planned number of observations
Approximately 1000 students (one observation per student).
Sample size (or number of clusters) by treatment arms
Approximately 500 students in the control group (no AI-generated suggestions) and 500 students in the treatment group (with AI-generated suggestions), based on random assignment via the last digit of the student number.
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
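As a minimal sketch only, a two-sample power calculation of the following form could be used to back out the minimum detectable effect for roughly 500 students per arm; the 5% significance level, 80% power, and 15-point standard deviation of marks are illustrative assumptions, not registered parameters.

```python
from statsmodels.stats.power import TTestIndPower

# Illustrative assumptions (not registered parameters):
ALPHA = 0.05        # two-sided significance level
POWER = 0.80        # target statistical power
N_PER_ARM = 500     # approximate arm size from the planned sample
SD_MARKS = 15.0     # assumed standard deviation of marks (0-100 scale)

# Smallest standardized effect (Cohen's d) detectable with these inputs.
d = TTestIndPower().solve_power(
    effect_size=None, nobs1=N_PER_ARM, ratio=1.0, alpha=ALPHA, power=POWER
)
print(f"MDE: {d:.3f} SD, i.e. about {d * SD_MARKS:.1f} marks out of 100")
```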
IRB

Institutional Review Boards (IRBs)

IRB Name
Utrecht University, Utrecht School of Economics
IRB Approval Date
2025-03-19
IRB Approval Number
25-001