Using Teacher Messaging to Scale Programs: How does targeting messages improve teacher implementation and student comprehension

Last registered on January 02, 2024

Pre-Trial

Trial Information

General Information

Title
Using Teacher Messaging to Scale Programs: How does targeting messages improve teacher implementation and student comprehension
RCT ID
AEARCTR-0012754
Initial registration date
December 26, 2023

First published
January 02, 2024, 10:56 AM EST

Locations

There is information in this trial unavailable to the public.

Primary Investigator

Ethan Scherer
Affiliation
Center for Education Policy Research at Harvard University

Other Primary Investigator(s)

Additional Trial Information

Status
In development
Start date
2023-10-27
End date
2025-03-31
Secondary IDs
Prior work
This trial does not extend or rely on any prior RCTs.
Abstract
Introducing a new curriculum or program often raises many questions. Whether it is a new reading curriculum in New York City or tutoring programs across the country to accelerate student learning post-COVID, principals and teachers may resist uprooting old practices and procedures. Most individuals are skeptical of something new even when there is evidence supporting its effectiveness. Thus, there is a need to complement evidence-based solutions with rigorous, research-based policies and procedures for scaling a solution within a district.

The Model of Reading Engagement (MORE) equips teachers with standards-aligned and evidence-based tools – lessons, digital activities, and formative assessments of transfer – designed to boost students’ reading comprehension and literacy skills through science and social studies units. In addition, MORE supplies districts with a set of scaling tools to advance equitable access to MORE across the district. These tools focus on three levers of change: (1) the Summer Leadership Institute (SLI) to train select district leaders and teachers, (2) a district-wide communication and implementation plan developed by district learning communities, and (3) MORE Teacher Innovators (TIs) to implement MORE principles outside of lesson time. Collectively, these levers help build district capacity and buy-in at multiple levels of the system. During the school year, the MORE team also provides professional learning and support, suggesting adjustments to the communication plan as needed and providing feedback to TIs on adaptations.

Our study will assess the implementation and impact of the district-wide communication element of the scaling strategy. We will randomize the timing of messages sent to teachers and analyze secondary aggregate classroom data to assess implementation and impact based upon the content of the messages. While the principles underlying these tools were developed with the MORE program in mind, many of the strategies could be applied in other contexts.
External Link(s)

Registration Citation

Citation
Scherer, Ethan. 2024. "Using Teacher Messaging to Scale Programs: How does targeting messages improve teacher implementation and student comprehension." AEA RCT Registry. January 02. https://doi.org/10.1257/rct.12754-1.0
Sponsors & Partners

There is information in this trial unavailable to the public.
Experimental Details

Interventions

Intervention(s)
Whether you are trying to diffuse hybrid corn in Iowa or a rigorous evidence-based reading curriculum, disseminating the innovation does not guarantee that it will be widely and deeply adopted. Diffusion is the social process that occurs among people in response to learning about an innovation – something new. More than sixty years of research has identified several factors that affect the rate of adoption of innovations (Rogers, 2003). First, it is important to clearly articulate the attributes of the innovation: its (1) relative advantage, (2) simplicity (e.g., is it easy to learn), (3) compatibility with other tools (e.g., does it complement other programs offered by a district), (4) cost (in time), (5) observability (e.g., can we see the results), and (6) trialability (e.g., the ability to try it before fully adopting it). Empirical work has found that attributes one through three are the most important for wide adoption. While district leadership often deliberates and discusses extensively before any significant change, insufficient time is typically allocated to distilling and communicating those deliberations clearly and concisely to all teachers, principals, and staff. This lack of time can reduce the likelihood that potential solutions are widely and deeply adopted.

A second important result of the research is that most individuals do not evaluate innovations based upon whether the solution is evidence-based. Instead, most people depend upon subjective evaluations of the innovations from people like themselves (Rogers, 2003). This often means that, for example, a teacher who has not adopted a particular innovation would find the recommendation of another teacher far more meaningful than information from an "expert," central office staff, or even a principal. Specifically, opinion leaders similar to a potential adopter can rapidly accelerate adoption and use (Dearing & Cox, 2018). Yet these forms of communication are often underutilized.

The Model of Reading Engagement (MORE) utilizes a top-down and bottom-up communication strategy that focuses on three levers of change: (1) the Summer Leadership Institute (SLI), (2) a district-wide communication and implementation plan, and (3) MORE Teacher Innovators (TIs), which together help build district capacity and buy-in at multiple levels of the system. This trial focuses on the second component, the district-wide communication and implementation plan. MORE’s online platform supports sending personalized messages to teachers, so district leaders can leverage a tiered communication strategy: general messages to all employees; scalable, targeted, behavioral-science-informed follow-ups for those who have not yet engaged with the content (Damgaard & Nielsen, 2018; Jackson & Makarin, 2018); and, finally, physical visits to schools. This tiered strategy allows leaders to better understand how to target their limited time and resources.

We randomize the timing of when these messages are sent to test their efficacy for MORE implementation and impact. The MORE portal provides real-time data on implementation (e.g., student logins, teacher logins, teacher completion of modules, student progress in activities, completion of lessons, and formative assessments), allowing us to use these data to test the efficacy of different message content sent to teachers.
Intervention Start Date
2024-01-08
Intervention End Date
2024-06-28

Primary Outcomes

Primary Outcomes (end points)
We will collect rich digital log data on students (digital activities and formative assessments of transfer) and teachers (portal usage) to assess both student and teacher engagement. These data also form the basis for the formative reports we will provide to all teachers and district leaders to monitor implementation.
1. Student portal: Engagement with digital activities in the student portal. Behavioral engagement measures will include information on whether and when the student logged in to the digital activities, total time (how much time they spent engaged with the digital activities), total books completed (a full set of activities), and the proportion of students who complete the formative assessment. Cognitive engagement will include measures of accuracy and reaction speed (e.g., the number and percentage of items students answer correctly, the number of seconds to complete each activity). Finally, students will complete exit tickets upon the completion of each of the 30 lessons, providing detailed information on exposure to the MORE lessons.
2. Teacher portal: Teacher engagement with the professional learning in the teacher portal. We will collect detailed data on whether teachers log in and whether they complete the training modules (see the sketch following this list).
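
To make these measures concrete, below is a minimal sketch of how the behavioral and cognitive engagement metrics could be aggregated from raw portal logs. The file name and all column names (student_id, login_time, session_seconds, book_completed, item_correct, activity_seconds, formative_assessment_done) are hypothetical placeholders, not the MORE portal's actual schema.

```python
# Hypothetical sketch: aggregating per-student engagement measures from
# raw portal logs (one row per activity event). File and column names
# are assumed placeholders, not the actual MORE portal schema.
import pandas as pd

logs = pd.read_csv("student_activity_logs.csv")

engagement = logs.groupby("student_id").agg(
    ever_logged_in=("login_time", lambda s: s.notna().any()),
    first_login=("login_time", "min"),
    total_minutes=("session_seconds", lambda s: s.sum() / 60),
    books_completed=("book_completed", "sum"),      # a "book" = full activity set
    pct_items_correct=("item_correct", "mean"),     # accuracy
    mean_seconds_per_activity=("activity_seconds", "mean"),  # reaction speed
)

# Proportion of students completing the formative assessment
completed = logs.groupby("student_id")["formative_assessment_done"].max()
print(f"Formative assessment completion rate: {completed.mean():.1%}")
```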
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
Domain-general, computer-adaptive word and text reading fluency (Amplify mCLASS DIBELS). The mCLASS DIBELS assesses several early literacy skills from kindergarten through sixth grade. The K-3 DIBELS assesses the following areas: sound fluency, phoneme segmentation fluency, letter naming fluency, nonsense word fluency, oral reading fluency, and retell ability (University of Oregon, 2018-2020). We will use a composite score that combines subtest scores for end-of-year nonsense word reading fluency (correct letter sounds and whole words read), oral reading fluency, and retell ability.

Domain-general reading comprehension, End-of-Grade (EOG) English language arts and math. Third graders will be eligible to take the North Carolina beginning-of-grade (BOG) and end-of-grade (EOG) exams. The North Carolina Beginning-of-Grade 3 (BOG3) English Language Arts (ELA)/Reading Tests are linked to the Read to Achieve Program and are aligned to the NC Standard Course of Study (NCSCS) (North Carolina Department of Public Instruction, 2021). Students read authentic literary and informational selections and then answer related questions. The North Carolina End-of-Grade Tests are designed to measure student performance on the goals, objectives, and grade-level competencies specified in the North Carolina Standard Course of Study.

Domain-specific knowledge of vocabulary networks. We developed a 24-item measure to assess third graders’ depth of science vocabulary knowledge. The 24-item semantic association task assesses students’ definitional knowledge of taught science words and their ability to identify relations between the target word and other known words (Kim, Burkhauser, et al., 2021). For example, the science measure includes 7 domain-specific words taught in the Grade 3 MORE science lessons (i.e., taught words): skeletal, muscular, nervous, diagnosis, structure, system, function. The task also includes 5 associated words that are not directly taught in the MORE lessons (i.e., untaught words): signal, repair, organ, fracture, sensory. The prompt asks students to “circle all of the words that go with the word signal,” and the options include “metal, messenger, transmit, similar.” Each item is scored dichotomously: both correct answer choices must be selected. Similarly, the measure includes 7 domain-specific words directly taught only in the treatment condition (astronaut, adventurous, ingenious, voyage, experiment, contributions, equipment) and 5 associated words that were not directly taught (instrument, control, link, commander, orbit).
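
As an illustration of this scoring rule, here is a minimal sketch that marks an item correct only when both keyed associates are circled. Which two options are keyed for the example item, and how circled distractors are handled, are assumptions for illustration; the registration states only that both correct choices must be selected.

```python
# Dichotomous scoring sketch for the semantic association task: an item
# scores 1 only if the student circled both keyed associates. (The keyed
# answers below and the treatment of extra circled distractors are
# assumptions for illustration.)
def score_item(circled: set, keyed: set) -> int:
    return int(keyed <= circled)  # all keyed answers must be circled

# Hypothetical example for the "signal" item with options
# metal, messenger, transmit, similar (assuming "messenger" and
# "transmit" are the keyed associates):
print(score_item({"messenger", "transmit"}, {"messenger", "transmit"}))  # 1
print(score_item({"messenger", "similar"}, {"messenger", "transmit"}))   # 0
```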

Domain-specific reading comprehension in science. Students will take a 30-item multiple-choice test (with 4 options per item) that assesses their ability to read near-, mid-, and far-transfer passages in science. The near-transfer passage assesses science content comprehension on a topic taught during the lessons (e.g., scientists studying how monkeys recover from heart attacks), while the mid- and far-transfer passages have children read content-related passages that were not directly taught during the lessons (e.g., how North American migratory birds’ skeletal and muscular systems are adapting, or the anatomy of a skyscraper).
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
The district will randomize the timing of when schools receive particular messages with different content. Messages will be sent at least one week apart from one another and personalized to each teacher using the real-time data available in the portal. Messages will vary by teacher/student participation levels and will focus on information, reminders, and social comparison nudges.
Experimental Design Details
Not available
Randomization Method
Randomization will be conducted by the participating district.
Randomization Unit
The unit of randomization for the main treatment condition (messages) versus the control condition (delayed messages) is the school. Randomization is blocked by learning community, a group of schools organized by region within the district (a sketch of this blocked procedure appears below).
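
For concreteness, a minimal sketch of school-level randomization blocked by learning community. The input file, column names, and the roughly even within-block split are assumptions, not the district's actual procedure.

```python
# Illustrative sketch: randomize schools to message vs. delayed-message
# conditions within learning-community blocks. File/column names and the
# within-block split are assumptions, not the district's actual procedure.
import numpy as np
import pandas as pd

rng = np.random.default_rng(20240108)  # fixed seed for reproducibility

schools = pd.read_csv("schools.csv")   # columns: school_id, learning_community

def assign_block(block: pd.DataFrame) -> pd.DataFrame:
    n_treat = len(block) // 2          # roughly half treated per block
    arms = ["message"] * n_treat + ["delayed"] * (len(block) - n_treat)
    block = block.copy()
    block["condition"] = rng.permutation(arms)
    return block

assignments = schools.groupby(
    "learning_community", group_keys=False
).apply(assign_block)
```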
Was the treatment clustered?
Yes

Experiment Characteristics

Sample size: planned number of clusters
116 schools
Sample size: planned number of observations
~20,000 students
Sample size (or number of clusters) by treatment arms
60 control schools and 56 treatment schools
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
A power analysis was conducted using PowerUp! software (Dong & Maynard, 2013) for a three-level multisite cluster randomized trial with treatment assigned at the school level. Students (Level 1) are nested within schools (Level 2), and schools are nested within randomization blocks (fixed-effect Level 3). Schools will be blocked based on the district's learning communities – groups of schools organized by region. Prior research was used to identify parameters for the power analysis. Zhu et al. (2012) report that between 6% and 13.2% of the variation in student test scores was between elementary schools. They also estimate that school-level covariates account for between 38% and 69% of the between-school variance. Therefore, our power analysis assumes 200 students per school after attrition (100 in Grade 1 and 100 in Grade 3), a school-level proportion of total variance of ICC = 0.06, a teacher-level proportion of total variance of ICC = 0.06, and a proportion of school-level variance explained by school-level covariates of R2 = 0.51. With a two-tailed test and alpha set at .05, the experiment will be sufficiently powered to detect an effect size of 0.12 with 116 clusters. We expect implementation effects to be larger than 0.12.
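
As a rough cross-check on these figures, below is a minimal sketch of the standard two-level cluster-randomized MDES formula (treatment at the school level), in the spirit of Dong & Maynard (2013). It omits the teacher-level variance component and blocking, and assumes no student-level covariates (R2 = 0 at Level 1), so it yields roughly 0.10 rather than reproducing the registered 0.12.

```python
# Approximate cross-check of the registered MDES using the two-level
# cluster-randomized design formula (treatment at the cluster level).
# This sketch omits the teacher-level variance component and blocking,
# and assumes no student-level covariates, so it will not exactly
# reproduce the registered 0.12.
from scipy.stats import t


def mdes_two_level(J, n, P, icc, r2_school, r2_student=0.0,
                   alpha=0.05, power=0.80):
    df = J - 2  # approximate df for a cluster-level treatment effect
    multiplier = t.ppf(1 - alpha / 2, df) + t.ppf(power, df)
    variance = (icc * (1 - r2_school)) / (P * (1 - P) * J) \
        + ((1 - icc) * (1 - r2_student)) / (P * (1 - P) * J * n)
    return multiplier * variance ** 0.5


# Registered parameters: 116 schools, ~200 students per school,
# 56/116 schools treated, school ICC = 0.06, school-level R2 = 0.51.
print(round(mdes_two_level(J=116, n=200, P=56 / 116,
                           icc=0.06, r2_school=0.51), 3))  # ~0.097
```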
IRB

Institutional Review Boards (IRBs)

IRB Name
Harvard University-Area Committee on the Use of Human Subjects
IRB Approval Date
2018-08-03
IRB Approval Number
IRB18-1094
Analysis Plan

There is information in this trial unavailable to the public. Use the button below to request access.

Request Information