AI Tutor and Student Outcomes

Last registered on August 22, 2025

Pre-Trial

Trial Information

General Information

Title
AI Tutor and Student Outcomes
RCT ID
AEARCTR-0016493
Initial registration date
August 19, 2025

First published
August 22, 2025, 5:56 AM EDT

Locations

Primary Investigator

Affiliation
Arizona State University

Other Primary Investigator(s)

PI Affiliation
Arizona State University
PI Affiliation
Arizona State University
PI Affiliation
Arizona State University

Additional Trial Information

Status
In development
Start date
2025-08-21
End date
2025-12-05
Secondary IDs
Prior work
This trial does not extend or rely on any prior RCTs.
Abstract
We evaluate the impact of AI-powered instructional bots on student learning in introductory microeconomics courses at a public university. These conversational bots are designed to teach students core concepts by answering questions, walking them through practice problems, and providing tailored explanations in real time. Each bot is aligned with a specific topic (such as demand, supply, market equilibrium, or elasticity) and is embedded in the weekly course modules. The randomized controlled trial assigns some students access to the bots for certain topics, while others follow the standard instructional materials alone. We examine effects on students’ course outcomes. The study sheds light on whether AI tools can serve as scalable instructional supplements in economics education, particularly for students who may need additional support outside of class.
External Link(s)

Registration Citation

Citation
Affonso Peyre, Agustina et al. 2025. "AI Tutor and Student Outcomes." AEA RCT Registry. August 22. https://doi.org/10.1257/rct.16493-1.0
Experimental Details

Interventions

Intervention(s)
Students are randomly assigned to receive access to interactive AI bots designed to teach core microeconomics topics such as demand, supply, and elasticity. The bots provide personalized feedback, topic-specific guidance, and reflective prompts via ASU’s CreateAI platform. Students in the control group receive standard instructional materials only. The intervention is conducted over five weeks.

Students assigned to the treatment group receive a total of five instructional emails, each aligned with a major topic in the course (e.g., demand, supply, elasticity). Each message introduces the relevant AI bot for that topic and includes a guide on how to interact with it via the ASU CreateAI Builder platform. These bots function as conversational tutors, capable of walking students through course content, problem-solving strategies, and real-time feedback. The first bot is tied to a required course assignment implemented via Canvas Quizzes, which prompts structured interactions with the bot through open-ended questions so that treated students become accustomed to working with it. All students complete this graded assignment: the treatment group uses the bot to complete it, while the control group uses the regular course materials.

In addition to the instructional emails, students in the treatment group also receive bot reminders before major assessments (such as midterms). These reminders encourage students to revisit the bots and review key content.

Students in the control group receive parallel weekly emails that serve as neutral study reminders, directing them to review course material posted on Canvas. Before major assessments, they receive reminders that mirror the treatment group's message in structure and tone but exclude mention of the bots.

All students complete two Qualtrics surveys:
- Baseline Survey: Administered in the first weeks of class; it collects demographic background, study routines, work status, AI familiarity, and expectations for the course. These data are also used to stratify the randomization, with strata defined by average high school GPA.
- Follow-Up Survey: Administered in the final weeks of the semester; it measures self-reported learning, confidence, engagement, perceptions of the bots (for treated students), and other academic behaviors. Both surveys are incentivized with extra credit.
Intervention (Hidden)
Intervention Start Date
2025-09-08
Intervention End Date
2025-12-05

Primary Outcomes

Primary Outcomes (end points)
- Academic performance scores
- Change in beliefs of effectiveness of study strategies (e.g., retrieval practice, spaced study, planning behavior)
- Changes in time allocation to different study habits while studying
- Bot usage analytics (for treated group only)
- Study habits behavior
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
- Perceived usefulness and satisfaction with the chatbot
- Time spent studying per week and cost of studying
- Frequency of bot usage (for chatbot treatment group only)
- Student engagement
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
Students enrolled in four introductory microeconomics courses at ASU in Fall 2025 are invited to participate in the study via course announcements and Canvas messages. Following consent and completion of a baseline survey, students are randomized at the individual level using a stratified procedure. Approximately half of the students are assigned to receive access to instructional AI bots aligned with five course topics. The other half receive standard instructional materials only.

Treatment is at the individual level and delivered over five weeks. All students complete a graded assignment aligned with the first bot topic: the treatment group uses the bot to complete the assignment, while the control group uses the regular materials. Assignments are graded for completion rather than correctness, and students receive post-assignment feedback through Canvas announcements. Surveys and extra-credit incentives are offered to encourage participation. Stratification is by average high school GPA.

Treatment group students receive five instructional messages (one per bot/topic) and exam reminder emails encouraging bot use. These messages include how-to guides, usage tips, and links to the CreateAI Builder platform.

At the end of the semester, all students are invited to complete a second survey, which captures self-reported learning outcomes, usage of instructional tools, engagement patterns, and (for treated students) evaluation of the bots.
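As noted above, randomization is at the individual level, stratified by average high school GPA, and carried out in Stata (see Randomization Method below). The following is a minimal illustrative sketch of such a procedure, not the authors' actual code; the data set name (baseline_survey.dta), the variable hs_gpa, and the use of GPA quartiles as strata are assumptions made for illustration.

    * Illustrative sketch of stratified individual-level randomization in Stata.
    * File and variable names are hypothetical; strata here are GPA quartiles.
    set seed 20250821                          // fix the seed for reproducibility
    use baseline_survey.dta, clear             // hypothetical baseline survey data
    xtile gpa_stratum = hs_gpa, nq(4)          // strata from average high school GPA
    gen double u = runiform()                  // random draw used to order students
    sort gpa_stratum u                         // shuffle students within each stratum
    by gpa_stratum: gen treat = (_n <= _N/2)   // first half of each stratum treated
    label define treatlbl 0 "Control" 1 "AI bot access"
    label values treat treatlbl

In practice, stratified assignment of this kind guarantees roughly balanced treatment and control shares within each GPA stratum while keeping assignment random at the student level.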
Experimental Design Details
Randomization Method
Randomization is done in Stata, in an office, by a computer.
Randomization Unit
individual level (student)
Was the treatment clustered?
No

Experiment Characteristics

Sample size: planned number of clusters
500 students
Sample size: planned number of observations
500 students
Sample size (or number of clusters) by treatment arms
250 control, 250 treatment
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
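The registration leaves the minimum detectable effect size blank. Purely as an illustration (not the authors' calculation), a conventional two-sample power calculation for the planned 250/250 split could be run in Stata as below, assuming a standardized outcome, 80% power, and a 5% two-sided test.

    * Illustrative power calculation for the planned sample; assumptions only.
    power twomeans 0, n(500) sd(1) power(0.8)
    * Solves for the detectable difference in means (in standard-deviation
    * units, since sd is normalized to 1) with 250 students per arm.

Under these assumptions the detectable effect is roughly 0.25 standard deviations; the actual MDE will depend on the outcome's variance, covariate adjustment, and attrition.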
IRB

Institutional Review Boards (IRBs)

IRB Name
ERA (Enterprise Research Administration) system
IRB Approval Date
2025-08-12
IRB Approval Number
STUDY00022657

Post-Trial

Post Trial Information

Study Withdrawal

Intervention

Is the intervention completed?
No
Data Collection Complete
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials