Bias in Online Classrooms
Last registered on April 29, 2021

Pre-Trial

Trial Information
General Information
Title
Bias in Online Classrooms
RCT ID
AEARCTR-0007618
Initial registration date
April 28, 2021
Last updated
April 29, 2021 6:17 AM EDT
Location(s)
Region
Primary Investigator
Affiliation
Stanford University
Other Primary Investigator(s)
PI Affiliation
PI Affiliation
Vanderbilt University
PI Affiliation
UC Irvine
Additional Trial Information
Status
Completed
Start date
2014-06-27
End date
2018-12-31
Secondary IDs
Abstract
While online learning environments are increasingly common, relatively little is known about classroom interactions in these settings. We test for the presence of race and gender biases among students and instructors in asynchronous online classes by measuring responses to comments posted in the discussion forums of 124 different online courses. Each comment was randomly assigned a student name connoting a specific race and gender.
External Link(s)
Registration Citation
Citation
Baker, Rachel et al. 2021. "Bias in Online Classrooms." AEA RCT Registry. April 29. https://doi.org/10.1257/rct.7618-1.0.
Experimental Details
Interventions
Intervention(s)
Fictional student accounts, each randomly assigned one of 8 race-gender identities (White, African-American, Chinese, or Indian, each paired with a male or female identity), posted randomly chosen and randomly sequenced comments in the general discussion forums of online courses.
Intervention Start Date
2014-08-01
Intervention End Date
2015-03-31
Primary Outcomes
Primary Outcomes (end points)
The first outcome domain is a binary indicator for whether the instructor responded to the comment. The second outcome domain is student responses to the comment (i.e., whether any student responded and the number of student responses).
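As an illustration only (not part of the registered protocol), the two outcome domains could be constructed from comment-level response logs roughly as follows; the data and column names (instructor_responses, student_responses) are hypothetical.

```python
import pandas as pd

# Hypothetical comment-level data: one row per placed comment, with raw
# response counts from instructors and students (column names are assumed).
comments = pd.DataFrame({
    "comment_id": [1, 2, 3],
    "instructor_responses": [0, 2, 1],
    "student_responses": [3, 0, 1],
})

# Outcome domain 1: binary indicator for any instructor response.
comments["instructor_responded"] = (comments["instructor_responses"] > 0).astype(int)

# Outcome domain 2: any student response and the number of student responses.
comments["any_student_response"] = (comments["student_responses"] > 0).astype(int)
comments["n_student_responses"] = comments["student_responses"]
```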
Primary Outcomes (explanation)
Secondary Outcomes
Secondary Outcomes (end points)
Secondary Outcomes (explanation)
Experimental Design
Experimental Design
In each MOOC, each of our eight race-gender identities placed one randomly assigned comment. To choose the sequencing of race-gender profiles within each course, we first established a random ordering of the eight profiles, constrained so that no same-gender or same-race identities appeared consecutively. For the first course in the study, we randomly assigned one of the 16 possible names appropriate to the race-gender identity of each poster. We then randomly assigned comments to these profiles in this randomly ordered sequence (i.e., 1, 2, 3, ..., 8); these 8 initial comments were randomly selected without replacement from the total list of 32 comments.

When a second eligible course opened, we randomly selected 8 comments from the remaining pool and assigned them to the race-gender profiles in a sequence rotated by one position (i.e., 2, 3, ..., 8, 1). As subsequent courses opened, we continued randomly selecting comments without replacement until the pool of 32 was exhausted; after every four courses, the procedure returned to the full set of 32 comments. Similarly, we continued rotating the sequence in which the race-gender profiles appeared and re-randomized the ordering once a full rotation was completed (i.e., every 8 courses). Names were likewise drawn at random without replacement and re-randomized after every 16 draws, so that names were balanced across the design of the study.
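A minimal sketch of this rotation-and-replenishment scheme is given below. This is not the study's actual code: the profile and comment labels are placeholders, and the no-consecutive-same-race/gender constraint on the initial ordering and the parallel name rotation are omitted for brevity.

```python
import random

# Placeholder labels; the study used 8 race-gender identities and 32 scripted comments.
PROFILES = list(range(8))
COMMENTS = list(range(32))

def course_assignments(n_courses, seed=0):
    """Yield (profile order, comments) for each course in sequence."""
    rng = random.Random(seed)
    profile_order = rng.sample(PROFILES, k=len(PROFILES))  # initial random ordering
    comment_pool = []
    for course in range(n_courses):
        # Draw comments without replacement; replenish once the pool of 32
        # is exhausted (i.e., after every 4 courses of 8 comments each).
        if not comment_pool:
            comment_pool = rng.sample(COMMENTS, k=len(COMMENTS))
        # Re-randomize the profile ordering after each full rotation (every 8 courses).
        if course > 0 and course % len(PROFILES) == 0:
            profile_order = rng.sample(PROFILES, k=len(PROFILES))
        # Rotate the profile sequence by one position for each new course.
        shift = course % len(PROFILES)
        order = profile_order[shift:] + profile_order[:shift]
        yield order, [comment_pool.pop() for _ in range(8)]
```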
Experimental Design Details
Randomization Method
Randomization done in office by a computer
Randomization Unit
The unit of randomization is the comment placed within a course.
Was the treatment clustered?
No
Experiment Characteristics
Sample size: planned number of clusters
n/a
Sample size: planned number of observations
n=992; 8 comments placed within each of 124 courses.
Sample size (or number of clusters) by treatment arms
We designated White male as the control condition; each of the other 7 conditions has a sample size of 124.
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
IRB
INSTITUTIONAL REVIEW BOARDS (IRBs)
IRB Name
Panel on Non-Medical Human Subjects, Stanford University
IRB Approval Date
2014-06-27
IRB Approval Number
30822
Post-Trial
Post Trial Information
Study Withdrawal
Intervention
Is the intervention completed?
Yes
Intervention Completion Date
March 31, 2015, 12:00 AM +00:00
Is data collection complete?
Yes
Data Collection Completion Date
March 31, 2015, 12:00 AM +00:00
Final Sample Size: Number of Clusters (Unit of Randomization)
Was attrition correlated with treatment status?
Final Sample Size: Total Number of Observations
Final Sample Size (or Number of Clusters) by Treatment Arms
Data Publication
Data Publication
Is public data available?
No
Program Files
Program Files
Reports, Papers & Other Materials
Relevant Paper(s)
REPORTS & OTHER MATERIALS