Studying without distractions? The effect of a digital blackout on academic performance

Last registered on December 10, 2020

Pre-Trial

Trial Information

General Information

Title
Studying without distractions? The effect of a digital blackout on academic performance
RCT ID
AEARCTR-0006378
Initial registration date
September 03, 2020

First published
September 03, 2020, 7:28 AM EDT

Last updated
December 10, 2020, 4:09 AM EST

Locations

Region

Primary Investigator

Affiliation
Bocconi University

Other Primary Investigator(s)

PI Affiliation
Bocconi University

Additional Trial Information

Status
In development
Start date
2020-09-07
End date
2021-07-31
Secondary IDs
Abstract
Rising concerns about the effects of technological distractions on concentration and learning outcomes prompt the question of how students can study, and use their smartphones, most efficiently. To investigate this issue, I assign first-year students at Bocconi University to the use of an app that helps them disconnect from distractions on their smartphones. The treatment lasts for several weeks, up to the mid-term exams, and through surveys before and after the intervention I aim to detect effects on academic performance, expectations about exam grades, course evaluations, and network influence.
External Link(s)

Registration Citation

Citation
Garbin, Francesca and Pamela Giustinelli. 2020. "Studying without distractions? The effect of a digital blackout on academic performance." AEA RCT Registry. December 10. https://doi.org/10.1257/rct.6378-1.1
Sponsors & Partners

There is information in this trial unavailable to the public.

Experimental Details

Interventions

Intervention(s)
The objective of this experiment is to test whether using an app that blocks digital distractions improves academic outcomes and, if so, to quantify the effect. Students will use the app for several weeks in order to prompt possible habit changes. If positive results are obtained, this research will be a valuable instrument to support the future implementation of a wider scheme aimed at all students.
Intervention Start Date
2020-09-21
Intervention End Date
2021-03-12

Primary Outcomes

Primary Outcomes (end points)
The analysis will use as outcome variables: grades and GPA, both in the first semester and in the following ones; students' expected and perceived performance before and after taking the exams; students' appreciation of courses; realized study time, which may be affected by the treatment; and other academic expectations. Heterogeneous effects may be assessed using app data on the frequency and length of breaks taken during the digital blackouts, as well as students' positions in the network structure. Treatment effects may be estimated as intention-to-treat effects (see the sketch below).
Primary Outcomes (explanation)
Primary outcomes stem from both survey measures and administrative data, and are combined with data on app usage. Surveys are administered at baseline, before the intervention, and then at different points in time during the semester after the intervention.
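As noted above, treatment effects may be estimated as intention-to-treat effects. Purely as an illustration, a minimal Stata sketch of such a specification is given below; the variable names (grade, treat, programme) are placeholders and are not part of this registration.

* Illustrative intention-to-treat regression: grade on assignment to the app,
* controlling for a placeholder stratification variable, with robust standard errors.
regress grade i.treat i.programme, vce(robust)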

Secondary Outcomes

Secondary Outcomes (end points)
The analysis may also focus on other relevant factors in the education production function and on time use. The survey questions collect a wide variety of both pre-intervention and post-intervention information that can be exploited in the analysis. Family inputs can be analysed in order to detect heterogeneous patterns. Elements related to socio-economic background, well-being, personality traits, study habits, COVID-19 risk perception, technology use and history, academic expectations, and social networks may be used as well.
Secondary Outcomes (explanation)
The surveys explore many types of outcomes and can be used to construct other relevant variables.

Experimental Design

Experimental Design
I implement an experiment with first-year Bocconi students to study whether smartphone distractions are detrimental to academic performance. At the beginning of the first semester, I survey eligible students in order to gather general information about their background and habits, as well as their willingness to participate in the experiment. Students are randomized into the treatment and are asked to download a blocking app on their smartphones. The app blocks other apps and their notifications (e.g. social media, messaging, news). Students are asked to activate it according to a set schedule for the four weeks of the experiment. During the experiment, students make the conscious and intentional choice to remain off their phones by using the app.
The intervention will run in the second semester (Spring 2021) as well.
Experimental Design Details
I implement an experiment with first-year Bocconi students to study whether smartphone distractions are detrimental to academic performance.
At the beginning of the first semester, I survey eligible students in order to gather: demographic information, general behaviors related to study and technological habits, and willingness to participate in the experiment. This is done by means of two brief online surveys that the directors of the undergraduate bachelor programmes send via email. The email also contains a link to a short video presenting the survey and advertising the possibility of entering a lottery as an incentive. Participation is voluntary.
At the end of the survey, the willingness to take part in the experiment is assessed. In the survey, and when enrolling for the experiment, the purpose of the research is stated in general terms so as not to create expectations or modify the behavior of control individuals.
The control and treatment groups are randomized according to some stratifying variables, subject to change depending on the number of respondents. Given the nature of the research interest, students attending classes fully or partially on campus will be involved in the experiment with higher priority than permanently off-campus students.
Students assigned to the treatment are asked to download a blocking app on their smartphones. The app blocks other apps and their notifications (e.g. social media, messaging, news). Students are asked to activate it every weekday from 2pm to 6pm for the four weeks of the experiment. If students are not on campus, either because they are "virtual" students or because they are alternating weeks of physical and online learning, they are still asked to activate the app from 2pm to 6pm (their own time zone) or, alternatively, in any four-hour window that better suits their lecture and study schedules. Every day students receive a reminder that they can tap to activate the block. When the block is active, the screen of the smartphone shows a timer; exiting the screen is recorded as a break. Students are free to leave and return to the screen at any time during the 2-6pm window, but they know that this behavior is monitored.
During the experiment, students make the conscious and intentional choice to remain off their phones by using the app, knowing that taking breaks will be monitored. The available data will be only about the usage of the app (e.g. activation of blocks, taking breaks). No other information concerning smartphone usage or location will be accessed.
Before the mid-term examinations, all students are asked to complete a short survey regarding their expected performance in the exams, as a tool for self-evaluation of their preparation.
When exams are over, students complete another survey about their ex-post performance expectations (before knowing the grades). Additionally, treated individuals are asked to provide feedback about their experience with the active app block.
All students are asked to take a final survey at the end of the semester, with content similar to the initial surveys, in order to assess possible changes in their study and technological habits.
The data for the analysis come from three sources: the described surveys; the app; and administrative records.

Incentives are provided in the form of lotteries. Each lottery randomly assigns Amazon gift cards.
For each survey, students are eligible for the lottery if they answer at least 80% of the questions. For the first two combined surveys, students will be offered one Amazon gift card per class, worth Euro 40 each. For each of the other three surveys, there will be 10 Amazon gift cards worth Euro 25 each.
Using the app will also be incentivized through a lottery. If a student activates the app on at least 80% of the required days (i.e. 16 out of 20) for at least 80% of the total required time, then she will be eligible for one of 10 Amazon gift cards worth Euro 70 each (a sketch of this eligibility rule follows below).
No penalty will stem from opting out of the experiment at any time.
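Purely as an illustration of the app-usage eligibility rule above, the sketch below flags eligible students from hypothetical daily usage records in Stata; the dataset structure and variable names (student_id, minutes_blocked) are assumptions, not part of this registration.

* One row per student per required weekday; minutes_blocked = minutes of active block on that day.
gen byte activated = (minutes_blocked > 0) & !missing(minutes_blocked)
bysort student_id: egen days_active = total(activated)
bysort student_id: egen minutes_total = total(minutes_blocked)
gen share_time = minutes_total / (20 * 240)   // 20 weekdays x 4 required hours
gen byte eligible = (days_active >= 16) & (share_time >= 0.8)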

In the second semester the same intervention will run. At the beginning of the semester, all first-year students are asked to participate in an initiative that helps them disconnect from their smartphones while studying. This invitation is more explicit about the goal than the general invitation of the first semester, while not directly implying that an app will be offered. Moreover, it is sent directly via email rather than at the end of a survey. I plan to randomize students into two groups: one that receives the app and one that does not. Both groups receive a "placebo" treatment, i.e. a weekly email that prompts behavior change by offering one or two pieces of advice per week and challenging students to implement them, so that the only difference between treated and control students lies in the use of the app.
Students assigned to the treatment are asked to repeat the same intervention as in the first semester. They need to download a blocking app on their smartphones and are asked to activate it every weekday from 2pm to 6pm for the four weeks of the experiment. Every day students receive a reminder that they can tap to activate the block. When the block is active, the screen of the smartphone shows a timer; exiting the screen is recorded as a break. Students are free to leave and return to the screen at any time during the 2-6pm window, but they know that this behavior is monitored.
When the mid-term exams in March 2021 are over, students complete a survey about their ex-post performance expectations (before knowing their grades). Additionally, treated individuals are asked to provide feedback about their experience with the active app block. Moreover, students who are participating in this semester's intervention but did not take the baseline survey in September are asked to fill in an additional set of baseline questions with the necessary background and habit information.
All students are asked to take a final survey at the end of the semester to assess possible changes in their study and technological habits and to elicit their willingness to pay for such a distraction-blocking tool.
As in the first semester, the data for the analysis come from the described surveys, the app, and administrative records.
Randomization Method
In the first semester, randomization into treatment was not feasible because of the small number of participants: randomizing would have left too little statistical power to detect significant effects. The analysis of the first-semester group ("the pilot") is therefore carried out using propensity score matching.
In the second semester, students who agree to participate in the experiment will be randomly divided into two groups. The control and treatment groups will be randomized according to some stratifying variables, possibly subject to change depending on the number of respondents. Randomization will be done using the statistical software Stata (a sketch follows below).
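A minimal sketch of the planned stratified, individual-level randomization in Stata is given below; the stratifying variables (programme, oncampus) and the seed are placeholders, since the registration leaves the stratifiers subject to change. A comment indicates how a propensity-score-matching estimate for the first-semester pilot could be obtained.

* Stratified randomization: within each stratum, assign the half with the
* lowest random draws to treatment (placeholder stratifiers: programme, oncampus).
set seed 20210201
gen double u = runiform()
sort programme oncampus u
by programme oncampus: gen byte treat = (_n <= ceil(_N/2))
* For the first-semester pilot, a propensity-score-matching estimate could be
* obtained with Stata's teffects command, e.g.:
* teffects psmatch (grade) (treat baseline_covariates)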
Randomization Unit
Randomization is conducted at the individual level.
Was the treatment clustered?
No

Experiment Characteristics

Sample size: planned number of clusters
The goal is to have at least several hundred students involved, in order to have hundreds of treated individuals. A maximum of 400 individuals can be treated, as this number corresponds to the number of app licenses currently purchased. In particular, in the second semester a maximum of 310 licenses will be made available.
Sample size: planned number of observations
The design is not clustered, so the number of clusters and the number of observations correspond.
Sample size (or number of clusters) by treatment arms
The goal is to have an almost balanced number of treated and control students.

Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
IRB

Institutional Review Boards (IRBs)

IRB Name
Bocconi Research Ethics Committee
IRB Approval Date
2020-07-21
IRB Approval Number
FA000028

Post-Trial

Post Trial Information

Study Withdrawal

There is information in this trial unavailable to the public.


Intervention

Is the intervention completed?
No
Data Collection Complete
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials