Bias in Online Classrooms

Last registered on April 29, 2021


Trial Information

General Information

Bias in Online Classrooms
Initial registration date
April 28, 2021

The initial registration date is when the registration was submitted to the Registry to be reviewed for publication.

First published
April 29, 2021, 6:17 AM EDT

First published corresponds to when the trial was first made public on the Registry after being reviewed.



Primary Investigator

Stanford University

Other Primary Investigator(s)

PI Affiliation
Vanderbilt University
PI Affiliation
UC Irvine

Additional Trial Information

Start date
End date
Secondary IDs
While online learning environments are increasingly common, relatively little is known about classroom interactions in these settings. We test for race and gender biases among students and instructors in asynchronous online classes by measuring responses to comments posted in the discussion forums of 124 online courses. Each comment was randomly assigned a student name connoting a specific race and gender.
External Link(s)

Registration Citation

Baker, Rachel et al. 2021. "Bias in Online Classrooms." AEA RCT Registry. April 29.
Experimental Details


Fictive students, each randomly assigned one of eight race-gender identities (White, African-American, Chinese, or Indian; each male or female), posted randomly chosen and sequenced comments in the general discussion forums of online courses.
Intervention Start Date
Intervention End Date

Primary Outcomes

Primary Outcomes (end points)
The first outcome domain is a binary indicator for whether the instructor responded to the comment. The second is student responses to the comments (i.e., any response and the number of responses).
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
In each MOOC, each of our eight race-gender identities placed one randomly assigned comment. To sequence the race-gender profiles within each course, we first established an initial random ordering of the eight profiles, constrained so that no same-gender or same-race identity appeared consecutively. For the first course in the study, we randomly assigned one of the 16 possible names appropriate to the race-gender identity of each poster. We then randomly assigned a comment to each profile in this randomly ordered sequence (i.e., 1, 2, 3, ..., 8). These eight initial comments were randomly selected without replacement from the total list of 32 comments. When a second eligible course opened, we randomly selected eight comments from the remaining pool and assigned them to race-gender profiles in a sequence rotated by one position (i.e., 2, 3, ..., 8, 1). As subsequent courses opened, we continued randomly selecting and assigning comments until the pool of 32 was exhausted; after every four courses, the procedure returned to the full set of 32 comments. Similarly, we continued rotating the sequence in which race-gender profiles appeared and re-randomized the ordering once a full rotation was achieved (i.e., every 8 courses). We likewise selected names at random without replacement and re-randomized every 16 draws so that names were balanced across the design.
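The assignment procedure described above can be sketched in code. This is a minimal illustration under stated assumptions: the profile, comment, and name pools below are hypothetical placeholders (the study's actual names and comments are not reproduced here), and the replenishment logic follows the every-4-course comment reset, every-8-course sequence re-randomization, and every-16-draw name re-randomization described above.

```python
import random

# Hypothetical placeholders standing in for the study's actual materials.
PROFILES = [f"{race}-{gender}"
            for race in ("White", "AfricanAmerican", "Chinese", "Indian")
            for gender in ("male", "female")]            # 8 identities
COMMENTS = [f"comment_{i:02d}" for i in range(32)]       # pool of 32 comments
NAMES = {p: [f"{p}_name_{j}" for j in range(2)] for p in PROFILES}  # 16 names

def _no_adjacent_repeats(seq):
    """No two consecutive profiles share a race or a gender."""
    races = [p.split("-")[0] for p in seq]
    genders = [p.split("-")[1] for p in seq]
    return all(races[i] != races[i + 1] and genders[i] != genders[i + 1]
               for i in range(len(seq) - 1))

def _random_profile_order(rng):
    """Rejection-sample an ordering satisfying the adjacency constraint."""
    order = PROFILES[:]
    while True:
        rng.shuffle(order)
        if _no_adjacent_repeats(order):
            return order

def assign_courses(n_courses, seed=0):
    """Return, per course, a list of (profile, name, comment) placements."""
    rng = random.Random(seed)
    order = _random_profile_order(rng)
    comment_pool, name_pool, plan = [], {}, []
    for c in range(n_courses):
        if c % 4 == 0:                      # comment pool lasts 4 courses (4 x 8 = 32)
            comment_pool = rng.sample(COMMENTS, len(COMMENTS))
        if c % 8 == 0 and c > 0:            # re-randomize after a full rotation
            order = _random_profile_order(rng)
        rotated = order[c % 8:] + order[:c % 8]   # rotate sequence by one per course
        course = []
        for profile in rotated:
            if not name_pool.get(profile):  # refill a profile's names when exhausted
                name_pool[profile] = rng.sample(NAMES[profile], 2)
            course.append((profile, name_pool[profile].pop(),
                           comment_pool.pop()))
        plan.append(course)
    return plan
```

For example, `assign_courses(4)` yields four courses of eight placements each, together exhausting the 32-comment pool exactly once before the pool resets.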
Experimental Design Details
Randomization Method
Randomization done in office by a computer
Randomization Unit
The unit of randomization is the comment placed within a course.
Was the treatment clustered?

Experiment Characteristics

Sample size: planned number of clusters
Sample size: planned number of observations
n=992; 8 comments placed within each of 124 courses.
Sample size (or number of clusters) by treatment arms
We designated White male as the control condition; each of the other seven conditions has a sample size of 124.
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)

Institutional Review Boards (IRBs)

IRB Name
Panel on Non-Medical Human Subjects, Stanford University
IRB Approval Date
IRB Approval Number


Post Trial Information

Study Withdrawal



Is the intervention completed?
Intervention Completion Date
March 31, 2015, 12:00 +00:00
Data Collection Complete
Data Collection Completion Date
March 31, 2015, 12:00 +00:00
Final Sample Size: Number of Clusters (Unit of Randomization)
Was attrition correlated with treatment status?
Final Sample Size: Total Number of Observations
Final Sample Size (or Number of Clusters) by Treatment Arms
Data Publication

Data Publication

Is public data available?

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials