Reading with AI: Can Technology Foster Reading Acquisition and Promote Educational Equity?

Last registered on November 19, 2025

Pre-Trial

Trial Information

General Information

Title
Reading with AI: Can Technology Foster Reading Acquisition and Promote Educational Equity?
RCT ID
AEARCTR-0017205
Initial registration date
November 14, 2025

Initial registration date is when the trial was registered.

It corresponds to when the registration was submitted to the Registry to be reviewed for publication.

First published
November 19, 2025, 1:57 PM EST

First published corresponds to when the trial was first made public on the Registry after being reviewed.

Locations

There is information in this trial unavailable to the public.

Primary Investigator

Affiliation
SciencesPo

Other Primary Investigator(s)

PI Affiliation
Department of Economics, École Normale Supérieure
PI Affiliation
Center for Research on Social Inequalities, SciencesPo
PI Affiliation
Department of Economics, École Normale Supérieure, and Paris School of Economics
PI Affiliation
Psychopathology & Change Process Lab, Paris Lumières University
PI Affiliation
Paris School of Economics

Additional Trial Information

Status
Ongoing
Start date
2024-09-01
End date
2030-12-31
Secondary IDs
Prior work
This trial does not extend or rely on any prior RCTs.
Abstract
While educational technology (edtech) has been the subject of numerous studies and shows positive effects on a range of outcomes, evidence regarding digital tools for learning how to read in first grade remains limited. In this study, we investigate whether a gamified digital platform developed by an edtech company can improve first-grade students' French language skills, in particular those of low-performing students, by targeting their zone of proximal development.

During the intervention, teachers receive a one-hour training on how to use the platform and free access to the most recent version of the tool. They are encouraged to use it for 40 minutes per week with all first-grade students. When students first connect to the platform, they complete an initial test so that the algorithm behind the tool forms a prior about their academic skills. Once this first test is completed, students face exercises adapted to their skills: neither so hard that they would be discouraged from trying again, nor so easy that they would not be challenged. The algorithm then keeps updating its estimate of each student's skills. As they use the platform, students receive rewards (stars, stories, progress in a fictitious world) based on their duration of use and on their academic progress. The tool thus aims both to adapt exercises to students' needs and to develop their motivation and engagement in reading.
External Link(s)

Registration Citation

Citation
Barone, Carlo et al. 2025. "Reading with AI: Can Technology Foster Reading Acquisition and Promote Educational Equity?." AEA RCT Registry. November 19. https://doi.org/10.1257/rct.17205-1.0
Sponsors & Partners

Sponsors

There is information in this trial unavailable to the public.
Experimental Details

Interventions

Intervention(s)
Teachers in our treated group received a one-hour training on the use of the edtech platform Lalilo. The training emphasized how to create logins for their students, how to monitor their progress through a dashboard, and how to assign lessons to students. During this training, teachers were reminded that they had volunteered to use the platform for 40 minutes per week with first-grade students and were advised to divide this time into two 20-minute sessions.

Teachers received their logins by email to access the most recent version of the tool. Access required a reliable internet connection, as the tool could not be downloaded. Teachers could then create logins for their students. Both students and teachers could then log in to the platform on digital devices, either at school (recommended) or at home (neither encouraged nor prohibited). We asked teachers to access the tool with their students on computers or tablets.

During their first connections, students were asked to complete an initial test so that the algorithm behind the platform could form a prior about their academic skills. Once this first test is completed, students face exercises adapted to their skills: neither so hard that they would be discouraged from trying again, nor so easy that they would not be challenged. The algorithm keeps updating its estimate of each student's skills. As they use the platform, students receive rewards (stars, stories, progress in a fictitious world) based on their duration of use and on their academic progress (difficulty of exercises), with the former carrying more weight than the latter in order to encourage perseverance and a feeling of competence among all students, including low-performing ones.
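The registration does not describe the platform's algorithm in technical detail. As a purely illustrative sketch of the kind of adaptive logic outlined above (not Lalilo's actual implementation), one can imagine a loop in which a skill estimate is initialized from the placement test, exercises are served near that estimate, the estimate is updated after each answer, and rewards weight time on task more than difficulty; all names and numerical choices below are hypothetical:

    import random

    class AdaptiveTutor:
        """Toy adaptive-difficulty loop; illustrative only, not Lalilo's algorithm."""

        def __init__(self, prior_skill=0.5, learning_rate=0.1):
            self.skill = prior_skill          # hypothetical skill estimate in [0, 1]
            self.learning_rate = learning_rate

        def set_prior_from_placement_test(self, n_correct, n_items):
            # The initial test forms a prior about the student's skills.
            self.skill = n_correct / n_items

        def next_exercise_difficulty(self):
            # Serve exercises close to the current estimate:
            # neither too hard nor too easy.
            return min(1.0, max(0.0, self.skill + random.uniform(-0.05, 0.05)))

        def update(self, answered_correctly):
            # Keep updating the skill estimate after each exercise.
            target = 1.0 if answered_correctly else 0.0
            self.skill += self.learning_rate * (target - self.skill)

        def reward_points(self, minutes_used, exercises_passed):
            # Weight time on task more than academic progress, as described above.
            return 2 * minutes_used + exercises_passed

    tutor = AdaptiveTutor()
    tutor.set_prior_from_placement_test(n_correct=6, n_items=10)
    difficulty = tutor.next_exercise_difficulty()
    tutor.update(answered_correctly=True)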
Intervention Start Date
2024-11-01
Intervention End Date
2026-07-01

Primary Outcomes

Primary Outcomes (end points)
Academic scores on the standardized French language exam at the beginning of grade 2 (short-term outcome) and of grades 3-6 (medium- and long-term outcomes). We will focus on the following items: writing, oral comprehension, and written comprehension. Student scores on performance anxiety (Anderson index), self-efficacy (Anderson index), motivation, and student engagement.
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
Student wellbeing (Anderson index), student growth mindset (Anderson index), scores in mathematics and reading aloud, students' score in comparison with peers (Anderson index); teacher scores in differentiated instruction (Anderson index), self-efficacy (Anderson index), use of formative assessment (Anderson index), use of cooperative practices, openness to digital tools (Anderson index), and teacher growth mindset (Anderson index).
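The Anderson indices listed among the outcomes presumably refer to Anderson (2008)-style inverse-covariance-weighted summary indices. As a hedged sketch (not the authors' analysis code), such an index combines standardized components with weights from the inverted covariance matrix, so that highly correlated items count less:

    import numpy as np

    def anderson_index(components):
        """components: (n_students, k_items) array of outcome components."""
        # Standardize each component (in practice, usually relative to the
        # control-group mean and standard deviation).
        z = (components - components.mean(axis=0)) / components.std(axis=0, ddof=1)
        # Weights are the row sums of the inverted covariance matrix, so that
        # highly correlated components receive less weight.
        sigma_inv = np.linalg.inv(np.cov(z, rowvar=False))
        weights = sigma_inv.sum(axis=1)
        index = z @ weights / weights.sum()
        # Re-standardize the resulting index for interpretability.
        return (index - index.mean()) / index.std(ddof=1)

    rng = np.random.default_rng(0)
    toy_index = anderson_index(rng.normal(size=(200, 4)))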
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
We partnered with Lalilo, an edtech company well established in France. Our intervention consisted of randomizing the timing at which volunteer teachers would receive free access to the edtech platform, as well as training and support for its use. Teachers could be randomized to receive access in either the 2024-2025 or the 2025-2026 academic year.

Importantly, not all teachers in France were eligible to be part of this experiment. They needed to fulfill the following requirements:
- The teacher can access at least five tablets or computers (or at least one device per four students) that can be used daily, together with an internet connection.
- The teacher teaches first‑grade students (CP) in the 2024‑2025 school year.
- The teacher does not actively use the platform in their class at the time of application and had no or low use of the platform over the past year.
- The teacher is willing to use the platform frequently and regularly and to comply with the year of treatment indicated by randomization (either the 2024-2025 or 2025-2026 school year, depending on the assigned group). It was recommended that each pupil use the edtech tool for at least 40 minutes per week, and the training advised two 20-minute sessions.

We assessed the eligibility of teachers in partnership with the edtech company, which verified their past use of the platform, and using data collected through a baseline survey completed by teachers before randomization.

Eligible teachers who had completed the survey were then randomized in two waves (late October 2024 and November 2024); the wave of randomization depended on the date of completion of the baseline survey. Treatment assignment was clustered at the school level and stratified on the following characteristics: past use of the edtech tool, above-median openness to using digital tools, priority education status, and academic region.

The randomization was performed in R with a randomly drawn seed.
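The randomization itself was carried out in R; the following Python sketch only illustrates the general logic of stratified, school-clustered assignment. The stratum column names, the fifty-fifty split within strata, and the seed handling are assumptions for illustration, not the authors' actual code:

    import pandas as pd

    def assign_treatment(schools: pd.DataFrame, seed: int) -> pd.DataFrame:
        """schools: one row per school with the (hypothetical) stratification columns."""
        strata = ["past_use", "high_openness_to_digital",
                  "priority_education", "academic_region"]
        assigned = []
        for _, stratum in schools.groupby(strata):
            # Shuffle schools within each stratum and treat half of them;
            # all teachers and students in a school share its assignment.
            shuffled = stratum.sample(frac=1, random_state=seed)
            n_treated = len(shuffled) // 2
            shuffled = shuffled.assign(
                treated=[1] * n_treated + [0] * (len(shuffled) - n_treated)
            )
            assigned.append(shuffled)
        return pd.concat(assigned)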
Experimental Design Details
Not available
Randomization Method
Stratified randomization done in office by a computer
Randomization Unit
School
Was the treatment clustered?
Yes

Experiment Characteristics

Sample size: planned number of clusters
223 schools, 258 teachers
Sample size: planned number of observations
2180 students
Sample size (or number of clusters) by treatment arms
113 treated schools, 110 control schools
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
Supporting Documents and Materials

There is information in this trial unavailable to the public.
IRB

Institutional Review Boards (IRBs)

IRB Name
IRB of the Paris School of Economics
IRB Approval Date
2024-06-24
IRB Approval Number
2024-030
Analysis Plan

There is information in this trial unavailable to the public.