Reducing impact of student bias on feedback surveys: A pilot study

Last registered on April 28, 2021

Pre-Trial

Trial Information

General Information

Title
Reducing impact of student bias on feedback surveys: A pilot study
RCT ID
AEARCTR-0006868
Initial registration date
December 08, 2020

Initial registration date is when the trial was registered; it corresponds to when the registration was submitted to the Registry to be reviewed for publication.

First published
December 09, 2020, 10:55 AM EST

First published corresponds to when the trial was first made public on the Registry after being reviewed.

Last updated
April 28, 2021, 2:10 PM EDT

Last updated is the most recent time when changes to the trial's registration were published.

Locations

Primary Investigator

Affiliation
The Ohio State University

Other Primary Investigator(s)

PI Affiliation
The Ohio State University
PI Affiliation
The Ohio State University
PI Affiliation
The Ohio State University

Additional Trial Information

Status
Ongoing
Start date
2021-01-11
End date
2021-07-31
Secondary IDs
Abstract
This project uses a randomized controlled trial to assess whether modified introductory language can mitigate implicit bias in student evaluations of instruction. So-called "cheap talk" scripts are presented to survey respondents in advance and describe the hypothetical biases that tend to arise in the study context (Cummings and Taylor, 1999). This is a potentially highly cost-effective strategy for improving the quality of information generated by student evaluations of instruction while also minimizing inequities for under-represented populations.
External Link(s)

Registration Citation

Citation
Genetin, Brandon et al. 2021. "Reducing impact of student bias on feedback surveys: A pilot study." AEA RCT Registry. April 28. https://doi.org/10.1257/rct.6868-1.2000000000000002
Sponsors & Partners

Some information in this trial is unavailable to the public.
Experimental Details

Interventions

Intervention(s)
All interventions related to the study will be administered by the investigators via the standard online platform for student evaluations of instruction. A subset of student participants will be randomly selected to receive additional introductory text related to the research study prior to completing the usual online evaluation of instruction for their enrolled courses. Instructor participants will not be asked to complete any activities for the study, other than to give their consent to participate. Data collected as part of the intervention will also be linked to student records (race, gender, course grade, major, rank) and instructor records (race, gender) to assess whether the effects of the intervention are heterogeneous across groups.
Intervention Start Date
2021-01-11
Intervention End Date
2021-05-01

Primary Outcomes

Primary Outcomes (end points)
The primary outcome is the Student Evaluation of Instruction (SEI) instrument, which measures the efficacy of the instructor's teaching. The SEI comprises ten questions:

1. The subject matter of this course was well organized
2. The instructor is well prepared
3. The instructor communicated the subject matter clearly
4. The instructor was genuinely interested in teaching
5. The instructor was genuinely interested in helping students
6. The instructor created an atmosphere conducive to learning
7. The course was intellectually stimulating
8. The instructor encouraged students to think for themselves
9. I learned a great deal from the instructor
10. Overall, I would rate this instructor as … [Poor, Fair, Neutral, Good, Excellent]

Responses on questions 1 through 9 range from Strongly Disagree to Strongly Agree.
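For analysis, agreement responses on questions 1 through 9 would typically be coded numerically. The registration states only the scale endpoints (Strongly Disagree to Strongly Agree), so the intermediate labels and the 1-to-5 coding in this Python sketch are assumptions, not part of the registered instrument:

```python
# Hypothetical numeric coding of a 5-point agreement scale for SEI
# questions 1-9. The registration specifies only the endpoints
# (Strongly Disagree ... Strongly Agree); the middle labels and the
# 1-5 scoring are illustrative assumptions.
LIKERT = {
    "Strongly Disagree": 1,
    "Disagree": 2,
    "Neutral": 3,
    "Agree": 4,
    "Strongly Agree": 5,
}

def score_responses(responses):
    """Map text responses to numeric scores, skipping blanks/unknowns."""
    return [LIKERT[r] for r in responses if r in LIKERT]

print(score_responses(["Agree", "Strongly Agree", ""]))  # → [4, 5]
```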
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
This research will utilize a “survey experiment” design, which involves embedding randomized text within a survey given to subjects. While all subjects will see the same questions (the current questions asked on the OSU Student Evaluation of Instruction), the introductory paragraph will vary across respondents.
Experimental Design Details
Randomization Method
Randomization is performed electronically in Stata.
Randomization Unit
Randomization occurs within each course section.
Was the treatment clustered?
No
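The registration states that randomization was done in Stata within each course section. As an illustration only, the sketch below shows one common way to implement within-section (stratified) assignment; the student and section identifiers and the arm labels are hypothetical, and this Python code is not the investigators' actual Stata procedure:

```python
import random
from collections import defaultdict

def assign_within_sections(students, arms, seed=0):
    """Randomly assign students to treatment arms within each course section.

    `students` is a list of (student_id, section_id) pairs; `arms` is the
    list of arm labels. Each section's roster is shuffled and arms are
    assigned in rotation, so assignment is balanced within every section.
    """
    rng = random.Random(seed)
    by_section = defaultdict(list)
    for student_id, section_id in students:
        by_section[section_id].append(student_id)

    assignment = {}
    for roster in by_section.values():
        rng.shuffle(roster)
        for i, student_id in enumerate(roster):
            assignment[student_id] = arms[i % len(arms)]
    return assignment

# Hypothetical example: two sections, control plus a cheap-talk variant.
students = [(f"s{i}", "SEC_A" if i < 8 else "SEC_B") for i in range(16)]
print(assign_within_sections(students, ["control", "cheap_talk"]))
```

Shuffling and then cycling through the arms (rather than drawing each student's arm independently) keeps arm sizes balanced within every section, which is the usual motivation for randomizing within strata.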

Experiment Characteristics

Sample size: planned number of clusters
Currently one university.
Sample size: planned number of observations
Approximately 6,000 observations.
Sample size (or number of clusters) by treatment arms
Approximately 1,500 observations for each treatment arm.
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
IRB

Institutional Review Boards (IRBs)

IRB Name
The Ohio State University's Behavioral & Social Sciences IRB
IRB Approval Date
2020-04-07
IRB Approval Number
2020B0049
Analysis Plan

Analysis Plan Documents

Pre-Analysis Plan (April 28, 2021)

MD5: 9a9885da2a2dbe93519ddc4120909b9b

SHA1: 264354c8696dca8497011024c49d5fcf05f899a6

Uploaded At: April 28, 2021

Post-Trial

Post Trial Information

Study Withdrawal

Some information in this trial is unavailable to the public.

Intervention

Is the intervention completed?
No
Data Collection Complete
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials