Inclusive Classrooms and Equitable Student Success: A Faculty Experiment
Last registered on October 06, 2020

Pre-Trial

Trial Information
General Information
Title
Inclusive Classrooms and Equitable Student Success: A Faculty Experiment
RCT ID
AEARCTR-0005505
Initial registration date
February 28, 2020
Last updated
October 06, 2020 4:31 PM EDT
Location(s)
Primary Investigator
Affiliation
Harvard Kennedy School
Other Primary Investigator(s)
PI Affiliation
Harvard University
PI Affiliation
Harvard University
PI Affiliation
Harvard University
Additional Trial Information
Status
In development
Start date
2020-10-07
End date
2020-12-20
Secondary IDs
Abstract
Researchers have documented racial and gender gaps in college enrollment decisions, choice of major, degree attainment, and earnings—despite narrowing gaps in test scores and course-taking in K-12 settings. Implicit racial and gender stereotypes of faculty members may affect their interactions with students and exacerbate these gaps, even without awareness or intent to harm members of underrepresented groups. Yet there is no causal evidence on the extent to which faculty's implicit bias contributes to these educational disparities, or on which types of interventions are cost-effective in mitigating any harmful effects of implicit bias on student achievement gaps. This study aims to address implicit bias among faculty members through a collaboration between psychologists and economists. First, we plan to study the relationship between faculty's implicit bias and gaps in student achievement, completion, and economic mobility using a newly constructed dataset that links schools' student-level and faculty-level administrative data with faculty's implicit association test (IAT) results. Second, we plan to implement a randomized field experiment to evaluate the effects of faculty implicit bias trainings on students' academic performance. Due to schools' adjustments to online education in March 2020, we will pilot the study using an online format in Fall 2020 at Reynolds Community College. The pilot will take place between October 7 and December 20, 2020.
External Link(s)
Registration Citation
Citation
Banaji, Mahzarin et al. 2020. "Inclusive Classrooms and Equitable Student Success: A Faculty Experiment." AEA RCT Registry. October 06. https://doi.org/10.1257/rct.5505-2.0.
Sponsors & Partners
Sponsor(s)
Experimental Details
Interventions
Intervention(s)
Intervention Start Date
2020-10-07
Intervention End Date
2020-12-20
Primary Outcomes
Primary Outcomes (end points)
We intend to evaluate the impact of our intervention on measures of students' academic performance, attainment, attitudes, and mobility.
Primary Outcomes (explanation)
Secondary Outcomes
Secondary Outcomes (end points)
Secondary Outcomes (explanation)
Experimental Design
Experimental Design
The study intends to evaluate how providing implicit bias training to higher education instructors impacts students' outcomes. In the fall of 2020, we will offer an implicit bias training to a randomly selected sample of instructors teaching courses in the Fall 2020 term at Reynolds Community College. The training is designed to expose faculty members to their own implicit biases and to provide them with tools to adjust their automatic patterns of thinking, with the ultimate goal of mitigating any biased behavior. This treatment, implemented by psychologists, will be based on scientific evidence and previous research results, and it will adopt a non-judgmental approach that focuses on the recipients' self-interest and organizational interest. Follow-up emails will be sent at most bi-weekly to remind instructors of the training content and to raise awareness about potentially biased behavior. We will then evaluate the impact on students' outcomes of interacting with instructors exposed to the training. Due to schools' adjustments to online education in March 2020, we will pilot the study using an online format in Fall 2020 at Reynolds Community College, randomizing across instructors from all departments.
Experimental Design Details
Randomization Method
The randomization will be done in the office by a computer.
Randomization Unit
We will randomize at the individual (instructor) level.
Was the treatment clustered?
No
Experiment Characteristics
Sample size: planned number of clusters
For the online pilot at Reynolds Community College in Fall 2020, we will stratify based on faculty's baseline survey completion, such that half of the survey completers are assigned to treatment and half are assigned to the control group.
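The stratified assignment described above can be sketched as follows. This is a minimal illustration only, not the study's actual randomization code; the function name, the seed, and the representation of the baseline-completion stratum are all assumptions introduced for the example.

```python
import random

def stratified_assign(instructor_ids, completed_baseline, seed=0):
    """Within each stratum (baseline-survey completers vs. non-completers),
    randomly assign half of the instructors to treatment and the rest
    to control. Hypothetical sketch, not the registry's actual procedure."""
    rng = random.Random(seed)  # fixed seed so the draw is reproducible
    assignment = {}
    for stratum_flag in (True, False):
        # Collect the instructors in this stratum.
        stratum = [i for i, c in zip(instructor_ids, completed_baseline)
                   if c == stratum_flag]
        rng.shuffle(stratum)
        half = len(stratum) // 2
        for i in stratum[:half]:
            assignment[i] = "treatment"
        for i in stratum[half:]:
            assignment[i] = "control"
    return assignment
```

With 328 instructors split evenly across the two strata, this yields the 164/164 treatment-control split reported below.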
Sample size: planned number of observations
For the online pilot at Reynolds Community College in Fall 2020, we will include 328 instructors across all departments in our randomization. We observe these instructors across more than 1,000 classes, and we also have information on over 6,000 students.
Sample size (or number of clusters) by treatment arms
For the online pilot at Reynolds Community College in Fall 2020, we will have 164 instructors assigned to treatment and 164 assigned to the control group.
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
Using administrative information from previous terms, we were able to simulate the minimum detectable effect (MDE) for different measures of students' performance. Our power calculations suggest that we will be able to detect an impact of 3 percent of a standard deviation for outcomes measured at the student-class level (e.g., grade) and from 13 to 18 percent of a standard deviation for outcomes measured at the instructor-class level (e.g., Black-White grade gap, Hispanic-White grade gap). To compute MDEs, we assumed a significance level of 5 percent and 80 percent power for the overall treatment.
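For reference, the textbook two-arm MDE formula underlying such power calculations can be sketched as below. This is an illustrative back-of-the-envelope version only: the registry's simulated MDEs are based on administrative data and covariate adjustment, so this unadjusted formula will not reproduce the 3 percent or 13-18 percent figures quoted above. The function name and the optional intracluster-correlation design effect are assumptions introduced for the example.

```python
from statistics import NormalDist

def mde(n_treat, n_control, alpha=0.05, power=0.80, icc=0.0, cluster_size=1):
    """Minimum detectable effect in standard-deviation units for a
    two-arm comparison of means, optionally inflated by the design
    effect for clustering, 1 + (m - 1) * icc (the Moulton factor)."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)  # critical value for a two-sided test
    z_beta = z.inv_cdf(power)           # quantile corresponding to the power
    deff = 1 + (cluster_size - 1) * icc
    return (z_alpha + z_beta) * (deff * (1 / n_treat + 1 / n_control)) ** 0.5

# Unadjusted instructor-level comparison: 164 treated vs. 164 control.
print(round(mde(164, 164), 3))
```

Setting `icc` and `cluster_size` would approximate the student-level case, where outcomes are clustered within instructors; covariate adjustment (not modeled here) is what drives the MDE down toward the simulated values.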
IRB
INSTITUTIONAL REVIEW BOARDS (IRBs)
IRB Name
Harvard University
IRB Approval Date
2020-02-21
IRB Approval Number
IRB20-0021
Analysis Plan
Analysis Plan Documents
Analysis Plan

MD5: b0c82c8247cc7dbd0da07bfe8b67682b

SHA1: 547bfaad8d5031107a7f66eb18cb646087b04472

Uploaded At: October 06, 2020

Post-Trial
Post Trial Information
Study Withdrawal
Intervention
Is the intervention completed?
No
Is data collection complete?
Data Publication
Data Publication
Is public data available?
No
Program Files
Program Files
Reports, Papers & Other Materials
Relevant Paper(s)
REPORTS & OTHER MATERIALS