The Impact of Peer Diversity on Academic Performance and Majoring in Economics

Last registered on April 13, 2020

Pre-Trial

Trial Information

General Information

Title
The Impact of Peer Diversity on Academic Performance and Majoring in Economics
RCT ID
AEARCTR-0003244
Initial registration date
April 10, 2020
Last updated
April 13, 2020, 12:20 PM EDT

Locations

Region

Primary Investigator

Affiliation
University of California-Los Angeles

Other Primary Investigator(s)

PI Affiliation
Norwegian School of Economics
PI Affiliation
University of Michigan
PI Affiliation
University of Michigan
PI Affiliation
University of Texas, Austin

Additional Trial Information

Status
Completed
Start date
2018-09-06
End date
2019-09-06
Secondary IDs
Abstract
We study the impact of low gender diversity in introductory economics classes on student performance, reported classroom experience, and the subsequent decision to major in economics. We randomize the gender composition of teaching assistant-led sections in introductory economics courses at a large, public university in the U.S. Students are randomly assigned to “more diverse,” “less diverse,” and “status quo” sections in which the fraction of women varies from around 10 to 80%. Main outcomes are students’ grades and future choices including enrollment in other economics courses and declaring economics as a major. We test different mechanisms mediating the treatment effects such as students’ perceptions of the climate in the classroom and heterogeneity by student and section variables.
External Link(s)

Registration Citation

Citation
Angelucci, Manuela et al. 2020. "The Impact of Peer Diversity on Academic Performance and Majoring in Economics." AEA RCT Registry. April 13. https://doi.org/10.1257/rct.3244-1.0
Sponsors & Partners

There are documents in this trial unavailable to the public.
Experimental Details

Interventions

Intervention(s)
Students are randomized into sections with more and less gender diversity. Students are surveyed at the beginning and end of the semester and followed in administrative data.
Intervention Start Date
2018-09-06
Intervention End Date
2019-09-06

Primary Outcomes

Primary Outcomes (end points)
Grade performance in ECON 101/2
Attendance of economics classes (both the section for ECON 101/2 and subsequent classes)
Primary Outcomes (explanation)
Our primary outcomes are of two types:
1) Indicators for course final grades as measured in university administrative data
• A- or better,
• B+ or better,
• B- or better, or
• C or better in the course (the minimum requirement if a student intends to declare the economics major or complete a business degree).

2) Measures of attendance and of continuing in economics
• Self-reported section attendance rate as indicated on the exit survey. The survey question reads: “Approximately, what percentage of the TA sections did you attend during the semester? (100 means you attended all TA sections)”
• Indicator for taking any additional economics course beyond ECON 101 or 102, as measured in university administrative data; and
• Number of economics courses taken beyond ECON 101 or 102, as measured in university administrative data.
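The grade outcomes above are a family of threshold indicators built from the same letter grade. A minimal Python sketch of that construction (the function and field names are invented for illustration; this is not the study team's code):

```python
# Letter grades ordered from best to worst; position encodes rank.
GRADE_ORDER = ["A+", "A", "A-", "B+", "B", "B-", "C+", "C", "C-",
               "D+", "D", "D-", "E", "F"]

def at_or_above(grade, threshold):
    """1 if `grade` is at least as good as `threshold`, else 0."""
    return int(GRADE_ORDER.index(grade) <= GRADE_ORDER.index(threshold))

# A hypothetical administrative record for one student.
record = {"final_grade": "B", "econ_courses_beyond_intro": 2}

outcomes = {
    "a_minus_or_better": at_or_above(record["final_grade"], "A-"),
    "b_plus_or_better":  at_or_above(record["final_grade"], "B+"),
    "b_minus_or_better": at_or_above(record["final_grade"], "B-"),
    "c_or_better":       at_or_above(record["final_grade"], "C"),
    "any_further_econ":  int(record["econ_courses_beyond_intro"] > 0),
    "n_further_econ":    record["econ_courses_beyond_intro"],
}
```

For a final grade of B, the B- and C indicators equal 1 while the A- and B+ indicators equal 0, so the four thresholds partition students by how well they performed.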

Secondary Outcomes

Secondary Outcomes (end points)
Pursuing an economics major
Declaring an economics major
Graduating with an economics major
Secondary Outcomes (explanation)
Our secondary outcomes are
1) Willingness to pursue an economics major as indicated on the exit survey.
The survey question reads: “As of today, how likely is it that you will choose to graduate with an economics major? Choose 100 if you are completely sure you will graduate with an economics major. Choose 0 if you are sure you will not graduate with economics as a major.” This variable is coded as a percentage between 0 and 100.

2) Declaring an economics major as measured in university administrative records; and
3) Graduating with an economics major as measured in university administrative records.
Under the hypothesis that more diverse sections improve the performance and experiences of women in economics, we expect to see a positive effect of ShareWomen on each of the secondary outcomes for women and no effect of ShareWomen on the secondary outcomes of men.

Experimental Design

Experimental Design
We randomize the gender composition of teaching assistant-led sections in introductory economics courses at a large, public university in the U.S.
Experimental Design Details
We randomize the gender composition of teaching assistant-led sections in introductory economics courses at a large, public university in the U.S. Students are randomly assigned to “more diverse,” “less diverse,” and “status quo” sections in which the fraction of women varies from around 10 to 80%.

After the registrar provided the study team with the name and gender of enrollees in these sections, the study team randomized individuals as follows.

1) In day-time sections with up to 70 students in a time-slot, we randomly assigned women to one Diverse section (35 students max), subject to the constraint that at least 3 women and 3 men be in each section. (This constraint was adopted to avoid extreme situations in which sections were single sex.) After fulfilling this requirement and filling up the Diverse section with the women, the remaining men were assigned to fill the remaining slots in either section.
Example: If there were exactly 35 women out of 70 enrollees in the time slot, the Diverse section would have 32 women and 3 men (91% female). The other section would have 3 women and 32 men (9% female). If there were 25 women enrolled in a time slot, then the share of women in the Diverse section would be 22/35 (62%) and the share of women in the other section would be 3/35 (9%).

2) In day-time sections with up to 105 students, we first randomly assigned a third of the men and women to a Status Quo section, which then had the same share of women as were enrolled in the time slot. Among the remaining students, we then randomly assigned women to a Diverse section subject to the constraint that at least 3 women and 3 men be in each of the two sections as explained in 1.

Example: If there were exactly 35 women out of 105 enrollees in the time slot, the Status Quo section would be 33% female (35/105 ≈ 33%, i.e., about 12 women among its 35 students), the Diverse section would have 20 women (35 - 12 - 3) for a share of 57% female, and the final section would have 3 women and 32 men (9% female).
3) The Department of Economics then notified students of their section ID, instructor and classroom before the semester began.
4) After this initial assignment, students could withdraw or enroll in other sections depending on their changing schedules and section openings.
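The two-section assignment in step 1 can be sketched in a few lines of Python. This is an illustrative reconstruction under the stated constraints (section cap of 35, at least 3 women and 3 men per section), not the study team's actual code; all names are invented:

```python
import random

SECTION_CAP = 35      # maximum students per TA section
MIN_PER_GENDER = 3    # at least 3 women and 3 men in every section

def assign_two_sections(students, seed=0):
    """Split up to 70 enrollees in a time slot into a Diverse section
    packed with randomly ordered women and a second section.

    `students` is a list of (student_id, gender) pairs with gender in
    {"F", "M"}; returns (diverse, other) lists of student ids."""
    rng = random.Random(seed)
    women = [sid for sid, g in students if g == "F"]
    men = [sid for sid, g in students if g == "M"]
    rng.shuffle(women)
    rng.shuffle(men)
    # Reserve the gender minimums so that no section is single-sex.
    other = women[:MIN_PER_GENDER] + men[:MIN_PER_GENDER]
    diverse = men[MIN_PER_GENDER:2 * MIN_PER_GENDER]
    # Fill the Diverse section with the remaining women first...
    remaining_women = women[MIN_PER_GENDER:]
    room = SECTION_CAP - len(diverse)
    diverse += remaining_women[:room]
    other += remaining_women[room:]
    # ...then let the remaining men fill leftover slots in either section.
    for sid in men[2 * MIN_PER_GENDER:]:
        (diverse if len(diverse) < SECTION_CAP else other).append(sid)
    return diverse, other
```

With 35 women and 35 men enrolled, this reproduces the split in the example for step 1: a Diverse section of 32 women and 3 men (91% female) and an other section of 3 women and 32 men (9% female).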

Note that there are two types of variation induced by this intervention. The first comes from the randomization in which we assign women to the Diverse or Status Quo sections and the men to the remaining slots. This ensures that the characteristics of women in these sections are no different from the characteristics of women in the other sections. The second source comes from natural variation in the share of women enrolled in a time slot. If, before the intervention is known, fewer women enroll in a time slot than their average representation in Economics 101/2, the Diverse and/or Status Quo sections will have a lower share of women than in time slots where more women enroll. We expect the share of women enrolled in the Diverse and Status Quo sections to vary considerably due to both sources of variation. Our analysis will use both sources of variation.

However, we do not use variation from potentially endogenous changes in class section. We will only use initial assignment in our analysis. Changes in enrollment resulting from the intervention will be examined as outcomes.
A key advantage of this strategy is that randomization happens behind the scenes at the administrative level: neither the students nor the teaching assistants leading the sections had any knowledge of the randomization. From the perspective of the students and teaching assistants, the enrollment and section-assignment procedure appeared identical to semesters when the experiment was not running. This was a deliberate choice by the study team, so that the study would come as close as possible to identifying the causal effect of having more women in one's section, rather than participant responses induced by awareness of the experiment. Although the instructors knew about the randomization, they taught the large lecture group and had no way to identify which students had been randomized.

We measure student outcomes using two data sources: (1) surveys at the beginning and end of the semester of the intervention and (2) university administrative records.

During the semesters when the interventions run, we invite all students in ECON 101 and 102 to take two surveys “to help the Department of Economics improve the economics curriculum.” Instructors mention these surveys in class and offer students a small amount of extra credit toward the final grade for participating in each survey. In addition, survey respondents are eligible to win five cash prizes of $100 and $50. Instructors have autonomy in the amount of extra credit they choose to award and also how they choose to assign it (with either everyone in the class receiving extra credit if the class as a whole meets a certain participation threshold or awarding extra credit to individuals for survey completion).

The surveys are administered during the first two and last two weeks of the semester and remain open for two weeks. The surveys contain nine modules and last about 15 minutes. The modules cover demographics, major declared/intended, reasons for selecting a particular major, perceptions about the ECON 101/2 class, grade expectations, minimum grade needed to seriously consider economics as a major, math courses previously taken, and experimental questions measuring IQ, risk aversion and competitiveness. Most questions in the initial and end surveys are the same although the latter includes several questions related to the climate students experience in the classroom.
University administrative records from the learning analytics dataset complement the survey records. These administrative records are available for students both before and after the intervention, spanning from the time students enroll at the university until the time they graduate. They include (1) background characteristics of the students and their families, (2) information from the student's application to the university, (3) all academic information, including class enrollment, enrolled and earned credits, grades in each class, and overall GPA, and (4) major and minor choices and earned degrees.

These records are available for all students at the University, regardless of whether the students were enrolled in the intervention terms or intervention sections. As part of the consent process for the surveys, students were asked to consent to linking their survey responses to university administrative records. University administrative records can, therefore, be linked to student survey responses for the subset of students consenting to this linkage.
Randomization Method
Randomization done in an office by a computer
Randomization Unit
Individual students
Was the treatment clustered?
Yes

Experiment Characteristics

Sample size: planned number of clusters
We conducted the intervention and collected survey data from three terms: Winter 2017 (pilot), Fall 2018, and Winter 2019. There were 114 sections (clusters) involved.

Sample size: planned number of observations
3,738 students
Sample size (or number of clusters) by treatment arms
More diverse=1,670 students (52 clusters)
Less diverse=1,730 students (52 clusters)
Status quo=338 students (10 clusters)
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
IRB

Institutional Review Boards (IRBs)

IRB Name
University of Michigan Health Sciences and Behavioral Sciences (HSBS)
IRB Approval Date
2016-12-08
IRB Approval Number
HUM00118784
Analysis Plan

There are documents in this trial unavailable to the public.

Post-Trial

Post Trial Information

Study Withdrawal

There are documents in this trial unavailable to the public.

Intervention

Is the intervention completed?
No
Data Collection Complete
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials