
My Professor Cares: Experimental Evidence on the Role of Faculty Engagement

Last registered on May 21, 2020

Pre-Trial

Trial Information

General Information

Title
My Professor Cares: Experimental Evidence on the Role of Faculty Engagement
RCT ID
AEARCTR-0005875
Initial registration date
May 20, 2020

Initial registration date is when the trial was registered. It corresponds to when the registration was submitted to the Registry to be reviewed for publication.

First published
May 21, 2020, 1:33 PM EDT

First published corresponds to when the trial was first made public on the Registry after being reviewed.

Locations

Primary Investigator

Affiliation
University of California, Davis

Other Primary Investigator(s)

PI Affiliation
University of California, Davis

Additional Trial Information

Status
Ongoing
Start date
2016-01-04
End date
2020-06-01
Secondary IDs
Abstract
Despite a growing body of literature that instructors "matter" in higher education, there is virtually no evidence about how their actions influence student outcomes. In this paper we provide experimental evidence on the impact of specific faculty behaviors aimed at increasing student success. We test the effect of professor feedback on student success in higher education classrooms through a "light-touch" randomized intervention. We present results from a small pilot in an introductory-level microeconomics course at a comprehensive research university, and from the scale-up conducted in over 30 classrooms with nearly 4,000 students at a large broad-access university. The intervention consisted of several strategically timed emails to students from the professor indicating keys to success in the class, the professor's knowledge of the students' current standing in the course, and a reminder of when the professor is available. Results from the pilot show that students in the treatment group scored higher on exams, homework assignments, and final course grades than students in the control group. Results from the larger experiment at the broad-access institution are more mixed: we find significant positive effects on student perceptions of the professor and course for all students, but positive achievement effects only for our target population, first-year students from underrepresented minority groups. Finally, we replicated the pilot to test the robustness of these results and again found positive effects on student achievement at the large comprehensive university. We conclude that in certain settings and with some students, targeted feedback from professors can lead to meaningful gains in achievement.
External Link(s)

Registration Citation

Citation
Carrell, Scott and Michal Kurlaender. 2020. "My Professor Cares: Experimental Evidence on the Role of Faculty Engagement." AEA RCT Registry. May 21. https://doi.org/10.1257/rct.5875-1.1
Sponsors & Partners

Sponsors

There is information in this trial unavailable to the public.
Experimental Details

Interventions

Intervention(s)
We test the effect of increased and individualized professor feedback on student success. The “light-touch” intervention consisted of strategically-timed personalized e-mails to students from the professor indicating the professor’s knowledge of the students’ current standing in the course, keys to success in the class, and a reminder of when the professor is available.
Intervention Start Date
2017-01-15
Intervention End Date
2017-12-22

Primary Outcomes

Primary Outcomes (end points)
Course grade, percentage of points earned in the course, and course passing.
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
To assess plausible mechanisms of treatment effects, we collected data on student perceptions of the professor and course. These included whether the professor was approachable, available, and cared as well as how well the student felt supported and informed.
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
The study was implemented in two separate waves, during the spring of 2016 and the fall of 2017, at a large, representative, broad-access four-year institution. We randomly chose 30 large undergraduate courses (each enrolling over 120 students) in each respective term and identified the instructor of record. Collaborating with the Campus Center for College and Career Readiness, we recruited these professors by sending personalized letters signed by both the Provost and the Dean of Undergraduate Studies. In total, 34 faculty members across 18 different course subjects participated in the study, with nearly 4,000 total students in the treatment and control groups.

In the spring of 2016, the treatment group comprised a randomly selected half of the students in 14 large undergraduate classes. The fall 2017 scale-up differed in two ways. First, to assess whether spillovers may be biasing our estimates, we drew the treatment group differently: in eight large classes (>120 students) we randomly assigned one-third of students to treatment (in contrast to the one-half randomized in the first wave), and in the ten cases where the professor taught two sections of the identical course, we randomly selected one entire section to receive treatment. Second, rather than sending two targeted emails as in the pilot (a 10-week quarter-system course), we sent three targeted emails in the scale-up (16-week semester-system courses). During the spring of 2016, the first email was an initial "welcome to my class" message containing strategies to succeed in the course; the second and third emails provided targeted performance feedback at the midway point of the course and just before the final exam. In the fall of 2017, similar to the pilot, we asked professors to give students in the treatment group targeted feedback based on the first "meaningful" assignment, as well as midway through the course and just before the final exam.
Experimental Design Details
Randomization Method
For the large classes, students were randomly selected within the class using a random number generator in MS Excel. For the ten cases where the professor taught two sections of the identical course, we randomly selected one section using a computer-generated "coin flip".
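The two randomization schemes described above can be sketched as follows. This is a hypothetical reimplementation in Python (the study itself used MS Excel's random number generator); the function names and seeds are illustrative.

```python
import random

def assign_within_class(student_ids, treat_fraction=0.5, seed=0):
    """Within-class randomization: assign a fraction of the students in
    one large class to treatment, the rest to control."""
    rng = random.Random(seed)
    ids = list(student_ids)
    rng.shuffle(ids)
    n_treat = round(len(ids) * treat_fraction)
    treated = set(ids[:n_treat])
    # Map each student id to True (treatment) or False (control).
    return {sid: (sid in treated) for sid in ids}

def assign_between_sections(section_a, section_b, seed=0):
    """Between-section randomization: for a professor teaching two sections
    of the identical course, a coin flip treats one entire section."""
    rng = random.Random(seed)
    if rng.random() < 0.5:
        return section_a, section_b  # (treated section, control section)
    return section_b, section_a
```

For example, `assign_within_class(range(120), treat_fraction=1/3)` reproduces the fall 2017 one-third assignment for a 120-student class.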
Randomization Unit
Randomization was done within classrooms for the large classes and across classrooms for the ten cases where the professor taught two sections of the identical course.
Was the treatment clustered?
Yes

Experiment Characteristics

Sample size: planned number of clusters
43 classrooms
Sample size: planned number of observations
3,930 students
Sample size (or number of clusters) by treatment arms
2,121 students in control and 1,809 in treatment, across 43 classrooms, 27 course subjects, and 34 different instructors.
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
We conducted several power analyses prior to implementation. For example, if the baseline course passing rate is 80%, at the 5% level of statistical significance we would be able to detect treatment effects of 0.078 percentage points or greater. For course performance, we will be able to detect even smaller treatment effects: with a 14-percentage-point standard deviation in course grades, we will be able to detect effect sizes as small as 3.195 percentage points.
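The standard two-sample minimum-detectable-effect formula behind such power calculations can be sketched as below. This is a back-of-envelope illustration using the registered arm sizes (1,809 treatment, 2,121 control); it ignores the clustering design effect, so its outputs will understate the registered MDEs. All function names are illustrative.

```python
from statistics import NormalDist

def mde_two_sample(se, alpha=0.05, power=0.80):
    """Minimum detectable effect of a two-sided test, given the standard
    error of the treatment-control difference: MDE = (z_{1-a/2} + z_power) * SE."""
    z = NormalDist()
    return (z.inv_cdf(1 - alpha / 2) + z.inv_cdf(power)) * se

def se_proportion(p, n_treat, n_control):
    """Standard error of a difference in proportions at baseline rate p."""
    return (p * (1 - p) * (1 / n_treat + 1 / n_control)) ** 0.5

def se_continuous(sd, n_treat, n_control):
    """Standard error of a difference in means with a common SD."""
    return sd * (1 / n_treat + 1 / n_control) ** 0.5

# Course passing: 80% baseline pass rate.
mde_pass = mde_two_sample(se_proportion(0.80, 1809, 2121))
# Course grade: 14-percentage-point standard deviation.
mde_grade = mde_two_sample(se_continuous(14.0, 1809, 2121))
```

With 80% power and a 5% two-sided test, this unclustered calculation gives an MDE of roughly 3.6 percentage points for passing and about 1.3 percentage points for grades; accounting for within-classroom clustering inflates both.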
IRB

Institutional Review Boards (IRBs)

IRB Name
University of California, Davis
IRB Approval Date
2016-01-04
IRB Approval Number
808387-1
Analysis Plan

There is information in this trial unavailable to the public.

Post-Trial

Post Trial Information

Study Withdrawal

There is information in this trial unavailable to the public.

Intervention

Is the intervention completed?
No
Data Collection Complete
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Yes
Program Files URL
Reports, Papers & Other Materials

Relevant Paper(s)

Abstract
We provide experimental evidence on the impact of specific faculty behaviors aimed at increasing student success for college students from historically underrepresented groups. The intervention was developed after conducting in-person focus groups and a pilot experiment. We find significant positive treatment effects across a multitude of short- and longer-run outcomes. Specifically, underrepresented students in the treatment group report more positive perceptions of the professor and earned higher course grades. These positive effects persisted over the next several years, with students in the treatment group more likely to persist in college, resulting in increased credit accumulation and degree completion.
Citation
Carrell, Scott E., and Michal Kurlaender. 2023. "My Professor Cares: Experimental Evidence on the Role of Faculty Engagement." American Economic Journal: Economic Policy 15 (4): 113–41.

Reports & Other Materials