Effects of providing evidence-based information on teachers' educational technology perceptions

Last registered on April 02, 2024

Pre-Trial

Trial Information

General Information

Title
Effects of providing evidence-based information on teachers' educational technology perceptions
RCT ID
AEARCTR-0013267
Initial registration date
March 31, 2024

The initial registration date is when the registration was submitted to the Registry for review.

First published
April 02, 2024, 12:48 PM EDT

First published is when the trial was first made public on the Registry after review.

Locations

This information is not available to the public.

Primary Investigator

Affiliation

Other Primary Investigator(s)

PI Affiliation
University of Chicago
PI Affiliation
University of Chicago

Additional Trial Information

Status
In development
Start date
2024-04-01
End date
2024-09-30
Secondary IDs
Prior work
This trial is based on or builds upon one or more prior RCTs.
Abstract
Previous research has found evidence that high-quality math apps can increase children's math skills. This project evaluates whether sharing this research-based information with PreK–4th grade teachers increases the likelihood that they recommend math apps to their students for home use. Additionally, we analyze whether knowing that other teachers recommend math apps influences a teacher's decision to do the same. We designed a factorial survey experiment. Teachers are asked how likely they are, on a scale from 0 to 10, to recommend four hypothetical apps, where each app is accompanied by one of the following sets of information: (1) just a description of the app, (2) the description and evidence-based information, (3) the description and the percentage of teachers recommending it, and (4) the description, evidence-based information, and the percentage of teachers recommending it. To be able to interpret the results causally, we randomize which app receives which set of information and the order in which the apps appear.
External Link(s)

Registration Citation

Citation
Bresciani, Daniela, Ariel Kalil and Susan Mayer. 2024. "Effects of providing evidence-based information on teachers' educational technology perceptions." AEA RCT Registry. April 02. https://doi.org/10.1257/rct.13267-1.0
Sponsors & Partners

This information is not available to the public.
Experimental Details

Interventions

Intervention(s)

Intervention Start Date
2024-04-01
Intervention End Date
2024-05-31

Primary Outcomes

Primary Outcomes (end points)
How likely teachers are to recommend each app, on a scale from 0 to 10
Primary Outcomes (explanation)
This outcome is measured directly from the survey questions.
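The registered analysis plan is not publicly available, so the following is only a plausible sketch of how a within-subject rating outcome like this is often analyzed, not the investigators' method: an OLS regression of the 0–10 rating on treatment indicators, with app and presentation-order fixed effects and standard errors clustered by teacher. All file and column names below are hypothetical.

```python
# Illustrative only: one common analysis for a within-subject factorial
# design, NOT the registered analysis plan (which is not public).
# Assumes long-format data with one row per teacher-by-app rating and
# hypothetical columns: rating (0-10), evidence (0/1), peer_share (0/1),
# app, position, teacher_id.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("ratings_long.csv")  # hypothetical file name

# Main effects of the two information treatments and their interaction,
# with app and presentation-order fixed effects; SEs clustered by teacher
# because each teacher rates all four apps.
model = smf.ols(
    "rating ~ evidence * peer_share + C(app) + C(position)",
    data=df,
).fit(cov_type="cluster", cov_kwds={"groups": df["teacher_id"]})
print(model.summary())
```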

Secondary Outcomes

Secondary Outcomes (end points)
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
We designed a factorial survey experiment with a within-subject design. Teachers are asked how likely they are, on a scale from 0 to 10, to recommend four hypothetical apps, where each app is accompanied by one of the following sets of information:
(1) A short description of the app
(2) A short description of the app + research-based evidence about its effectiveness
(3) A short description of the app + the percentage of teachers that are recommending it
(4) A short description of the app + research-based evidence about its effectiveness + the percentage of teachers that are recommending it

The set of information shown with each app and the order in which the apps appear will be randomized.
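Randomization itself is handled by Qualtrics (see Randomization Method below). Purely as a hedged illustration of the scheme described above, the Python sketch that follows matches the four information sets one-to-one with the four apps at random for each respondent and then independently shuffles the presentation order; the app names, condition labels, and seed are placeholders, not the study's materials.

```python
# Minimal sketch of the randomization scheme; the actual randomization is
# performed by the survey platform (Qualtrics).
import random

APPS = ["App A", "App B", "App C", "App D"]  # hypothetical app names
INFO_SETS = [
    "description only",
    "description + evidence",
    "description + % of teachers recommending",
    "description + evidence + % of teachers recommending",
]

def assign(rng: random.Random) -> list[tuple[str, str]]:
    """Return (app, information set) pairs in presentation order
    for one respondent."""
    info = INFO_SETS[:]
    rng.shuffle(info)               # which app gets which information set
    pairs = list(zip(APPS, info))
    rng.shuffle(pairs)              # the order in which the apps appear
    return pairs

rng = random.Random(2024)           # arbitrary seed for reproducibility
for app, info_set in assign(rng):
    print(f"{app}: {info_set}")
```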
Experimental Design Details
Not available
Randomization Method
Randomization done by survey platform (Qualtrics)
Randomization Unit
Individual
Was the treatment clustered?
No

Experiment Characteristics

Sample size: planned number of clusters
1000 teachers
Sample size: planned number of observations
1000 teachers
Sample size (or number of clusters) by treatment arms
All 1000 teachers receive all four information conditions (within-subject design)
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
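The registration leaves this field blank, and nothing below should be read as the study's power calculation. Purely as an illustration of the arithmetic for a within-subject contrast between two information conditions with n = 1000 teachers, MDE ≈ (z_{1−α/2} + z_{1−β}) · σ_diff / √n, where σ_diff is the standard deviation of within-teacher rating differences; the correlation and SD values below are invented.

```python
# Illustrative power arithmetic only; the registration does not report an
# MDE, and rho and sd below are invented for the example.
from math import sqrt
from scipy.stats import norm

n = 1000             # planned number of teachers (from the registration)
alpha, power = 0.05, 0.80
rho = 0.5            # assumed within-teacher correlation of ratings
sd = 2.5             # assumed SD of the 0-10 rating

sd_diff = sd * sqrt(2 * (1 - rho))   # SD of a within-teacher difference

z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
mde = z * sd_diff / sqrt(n)
print(f"Illustrative MDE on the 0-10 scale: {mde:.2f}")
```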
IRB

Institutional Review Boards (IRBs)

IRB Name
Social and Behavioral Sciences Institutional Review Board
IRB Approval Date
2024-02-26
IRB Approval Number
IRB24-0151
Analysis Plan

This information is not available to the public.