The Effect of Providing Feedback to Teachers on Focusing Questions: Using Natural Language Processing to Surface Insights to Teachers on Focusing Questions and Measuring the Effect of This Feedback

Last registered on April 13, 2023

Pre-Trial

Trial Information

General Information

Title
The Effect of Providing Feedback to Teachers on Focusing Questions: Using Natural Language Processing to Surface Insights to Teachers on Focusing Questions and Measuring the Effect of This Feedback
RCT ID
AEARCTR-0011258
Initial registration date
April 12, 2023

Initial registration date is when the trial was registered. It corresponds to when the registration was submitted to the Registry to be reviewed for publication.

First published
April 13, 2023, 4:44 PM EDT

First published corresponds to when the trial was first made public on the Registry after being reviewed.

Locations

Primary Investigator

Affiliation
Stanford University

Other Primary Investigator(s)

PI Affiliation
University of Maryland
PI Affiliation
Harvard University
PI Affiliation
Stanford University

Additional Trial Information

Status
Completed
Start date
2022-09-10
End date
2023-04-10
Secondary IDs
Prior work
This trial is based on or builds upon one or more prior RCTs.
Abstract
This project is a collaboration between TeachFX and a research team specializing in the use of natural language processing in education. It involves giving teachers and tutors feedback on their use of focusing questions in the classroom, leveraging both manual annotation and computational natural language processing techniques. The primary purpose of this study is to measure the effect of this feedback.
External Link(s)

Registration Citation

Citation
Demszky, Dora et al. 2023. "The Effect of Providing Feedback to Teachers on Focusing Questions: Using Natural Language Processing to Surface Insights to Teachers on Focusing Questions and Measuring the Effect of This Feedback." AEA RCT Registry. April 13. https://doi.org/10.1257/rct.11258-1.0
Sponsors & Partners

There is information in this trial unavailable to the public.
Experimental Details

Interventions

Intervention(s)
This randomized controlled trial was conducted through the TeachFX platform with 523 teachers from Utah school districts and 21 tutors from an online tutoring company, Cignition. Teachers in the treatment group received a weekly email, sent early every Tuesday morning, reporting the number of focusing questions they had asked across all class recordings in the previous week, along with a display of up to three selected focusing questions. Focusing questions probe students’ thinking instead of guiding them toward a predetermined answer (Alic et al., 2022).

The email also contains a link to the focusing-questions insight page in the TeachFX app and to a resources page (https://medium.com/dorademszky/resources-for-focusing-questions-47bc6cdd9953) explaining what focusing questions are and how to ask students more of them. It further includes the top three focusing-question starters observed that week, identified across all teachers by a pair of annotators (described below). Below are screenshots of an email that was sent to a teacher.

The treatment group can also see the focusing questions they asked in the TeachFX app, while the control group does not have access to this insight.

Detecting focusing questions
The focusing questions are identified by a machine learning model we built to classify teacher utterances as focusing questions or not. We fine-tuned a large language model (BERT) on data from the National Center for Teacher Effectiveness (NCTE), which includes annotations of whether each teacher utterance is a focusing question. Given a transcript of a class recording, we extract and pre-process the teacher utterances and feed them to this binary classification model, which labels each utterance as a focusing question or not. In a given week, for each teacher in the treatment group, we store the list of focusing questions the model identified in each recording the teacher made that week.
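The detection-and-storage pipeline described above can be sketched roughly as follows. This is a hypothetical illustration rather than the production system: the keyword heuristic stands in for inference with the fine-tuned BERT classifier, and all function and variable names are assumptions.

```python
from collections import defaultdict

def is_focusing_question(utterance):
    # Stand-in for the fine-tuned BERT binary classifier; the real
    # system would tokenize the utterance and run model inference here.
    cues = ("why do you think", "how did you decide", "what do you notice")
    u = utterance.lower().strip()
    return u.endswith("?") and any(c in u for c in cues)

def extract_focusing_questions(transcript):
    # transcript: list of {"speaker": "teacher"|"student", "text": ...}
    teacher_utts = [t["text"] for t in transcript if t["speaker"] == "teacher"]
    return [u for u in teacher_utts if is_focusing_question(u)]

# Per-(teacher, week) store of model-flagged focusing questions.
weekly_focusing = defaultdict(list)

def log_recording(teacher_id, week, transcript):
    # Append this recording's focusing questions to the teacher's
    # running list for the week.
    weekly_focusing[(teacher_id, week)].extend(
        extract_focusing_questions(transcript)
    )
```

In this sketch, `weekly_focusing` is the per-teacher, per-week list that would later be passed to the annotators for top-3 selection.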

TeachFX sends this data to a pair of annotators, who label (at most) the top three focusing questions for each teacher in the treatment group who recorded that week. These are the questions surfaced in the email.

Raffle
Because it can be difficult to ensure that teachers record consistently across weeks, TeachFX ran raffles to incentivize Utah teachers to record more consistently. Each raffle offered the chance to win a $250 Amazon gift card. The details of the four raffles TeachFX conducted are:

Raffle 1: Any teacher that recorded at least once per week between October 10th, 2022 and November 10th, 2022 would have the opportunity to win a $250 Amazon gift card.
Raffle 2: Any teacher that recorded at least 5 times between December 5th, 2022 and December 23rd, 2022 would have the opportunity to win a $250 Amazon gift card.
Raffle 3: Any teacher that recorded at least once per week between February 6th, 2023 and February 17th, 2023 would have the opportunity to win a $250 Amazon gift card.
Raffle 4: Any teacher that recorded at least once per week between February 27th, 2023 and March 10th, 2023 would have the opportunity to win a $250 Amazon gift card.
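The eligibility rules above come in two forms: "at least once per week" within a window (Raffles 1, 3, and 4) and "at least N recordings" within a window (Raffle 2). A minimal sketch of how such checks might be computed, with hypothetical function names, is:

```python
from datetime import date, timedelta

def eligible_weekly(record_dates, start, end):
    # Raffles 1, 3, 4: at least one recording in every week of the window.
    week_start = start
    while week_start <= end:
        week_end = min(week_start + timedelta(days=6), end)
        if not any(week_start <= d <= week_end for d in record_dates):
            return False
        week_start += timedelta(days=7)
    return True

def eligible_count(record_dates, start, end, n=5):
    # Raffle 2: at least n recordings anywhere in the window.
    return sum(start <= d <= end for d in record_dates) >= n
```

For example, a teacher who recorded on Feb 6 and Feb 14, 2023 would satisfy the weekly rule for Raffle 3's Feb 6 to Feb 17 window.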

Other Emails Encouraging Teachers to Record Consistently:
TeachFX also sent a ‘Words you used the most’ email at the end of October to all Utah teachers who had recorded that month. For each teacher, this email displayed the most frequent words spoken by the teacher and their students, shown as teacher and student word clouds for the month. It also reminded teachers that the October–November raffle period would end in mid-November and that they remained on track for the raffle prize if they continued to record every week. Finally, the email demonstrated that teachers could receive similar insights by continuing to record consistently.


Post-intervention procedures
After a teacher completes 5 weeks of recordings, we consider the research study complete for that teacher and stop sending them emails. Following this, TeachFX sends the teacher a survey: Teacher survey for focusing RCT - v2.

Further, to incentivize teachers to complete the survey, TeachFX frames the survey email as follows: “Thank you for using TeachFX to reflect on your teaching practice! To learn how to better support teachers and improve our app insights, we are sending you a survey about your background and the teaching practices used in your math/science lessons. This survey will take no more than 5 minutes to complete, and your identity will remain strictly confidential. To show our appreciation for your time, we will send you a $10 Starbucks gift card for completing the survey.”

We also plan to conduct interviews with all teachers who complete the survey to learn more about their background, their experience with TeachFX, and (for the treatment group) how feedback on focusing questions affected their teaching.
Intervention Start Date
2022-10-10
Intervention End Date
2023-03-10

Primary Outcomes

Primary Outcomes (end points)
Measured weekly for each teacher, after pooling all of that teacher’s recordings for the week:
Rate of focusing questions teachers ask per hour
Teacher-student talk ratio
Rate of uptake of student contributions per hour
Rate of student reasoning per hour

Number of recordings throughout the study
Average number of recordings per week
Number of weeks of recordings
Number of email opens and clicks for teachers in the treatment group

Teachers’ self-efficacy ratings from the survey
Thinking about your mathematics/science teaching, please indicate your opinion about each of the statements below.
My questions elicit students’ mathematical/scientific thinking and reasoning.
My students talk about their mathematical/scientific ideas.
I pose open-ended questions.
I engage my class(es) in discussion.
I require students to explain their reasoning when giving an answer.
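The weekly per-hour rates listed above could be computed as in the following minimal sketch: pool a teacher's recordings for the week, then divide the event counts by the total recorded hours. Function and variable names are hypothetical, not TeachFX's implementation.

```python
def weekly_rate_per_hour(event_counts, durations_minutes):
    # event_counts: number of target events (e.g., focusing questions,
    # uptake instances) in each recording that week.
    # durations_minutes: length of each of those recordings.
    total_hours = sum(durations_minutes) / 60.0
    if total_hours == 0:
        return 0.0
    return sum(event_counts) / total_hours

def talk_ratio(teacher_seconds, student_seconds):
    # Teacher-student talk ratio over the pooled week.
    return sum(teacher_seconds) / sum(student_seconds)
```

For instance, 3 + 2 + 4 focusing questions across recordings of 30, 30, and 60 minutes gives 9 events over 2 hours, i.e., a rate of 4.5 per hour.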
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
Please refer to Intervention Design above.
Experimental Design Details
Randomization Method
Subjects were randomized algorithmically (coin flip) upon making a recording via TeachFX on or after Oct 10, 2022.
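An algorithmic per-teacher coin flip of this kind can be sketched as follows; this is a hypothetical illustration, not TeachFX's actual implementation, and the seed string is an assumption.

```python
import random

def assign_arm(teacher_id, seed="rct-0011258"):
    # Deterministic 50/50 coin flip at first recording: seeding an RNG
    # with the teacher ID makes the assignment reproducible and
    # independent across teachers.
    rng = random.Random(f"{seed}:{teacher_id}")
    return "treatment" if rng.random() < 0.5 else "control"
```

Because the seed is derived from the teacher ID, re-running the assignment for the same teacher always yields the same arm.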
Randomization Unit
Teachers
Was the treatment clustered?
Yes

Experiment Characteristics

Sample size: planned number of clusters
2 (Utah teachers and a tutoring company)
Sample size: planned number of observations
523 teachers from Utah school districts and 21 tutors from an online tutoring company
Sample size (or number of clusters) by treatment arms
256 teachers in the treatment group and 267 in the control group for Utah
21 in the treatment group and 19 in the control group for the tutoring company
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
IRB

Institutional Review Boards (IRBs)

IRB Name
Stanford IRB
IRB Approval Date
2022-09-20
IRB Approval Number
66094
Analysis Plan

There is information in this trial unavailable to the public.

Post-Trial

Post Trial Information

Study Withdrawal

There is information in this trial unavailable to the public.

Intervention

Is the intervention completed?
No
Data Collection Complete
Yes
Data Collection Completion Date
Final Sample Size: Number of Clusters (Unit of Randomization)
Was attrition correlated with treatment status?
Final Sample Size: Total Number of Observations
Final Sample Size (or Number of Clusters) by Treatment Arms
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials