The Effects of Course Information on Usage
Last registered on January 22, 2019

Pre-Trial

Trial Information
General Information
Title
The Effects of Course Information on Usage
RCT ID
AEARCTR-0003654
Initial registration date
January 01, 2019
Last updated
January 22, 2019 4:05 PM EST
Location(s)
Region
Primary Investigator
Affiliation
Other Primary Investigator(s)
PI Affiliation
Texas A&M University
PI Affiliation
Texas A&M University
Additional Trial Information
Status
Ongoing
Start date
2018-11-05
End date
2019-12-31
Secondary IDs
Abstract
College students rely on a diverse set of information when making educational decisions. A largely uncharted area in the education literature is what information college students use to make course enrollment and field-of-study decisions. Our study aims to fill this gap by providing students with grade outcomes of previous courses taught at Texas A&M University in an easy-to-read graphical interface that allows comparison across classes. To study whether students use previous course outcome information, we implement a randomized controlled trial (RCT). To measure usage of our webpage, we introduce a brief survey asking for a student's Texas A&M email and major, and we measure total visits to our web tool page. To examine the effects on academic outcomes, we will also look at course and major selection in spring 2019 as outcomes.
External Link(s)
Registration Citation
Citation
Lindo, Jason and Jonathan Tillinghast. 2019. "The Effects of Course Information on Usage." AEA RCT Registry. January 22. https://doi.org/10.1257/rct.3654-2.0.
Former Citation
Lindo, Jason et al. 2019. "The Effects of Course Information on Usage." AEA RCT Registry. January 22. http://www.socialscienceregistry.org/trials/3654/history/40466.
Experimental Details
Interventions
Intervention(s)
To study the impact of access to previous course grade information, we conduct a randomized controlled trial (RCT) by sending access to a random set of Texas A&M University (TAMU) student emails. The course tool uses previous grade information from courses taught at TAMU and displays it in an easy-to-read fashion using data visualization software. The course tool displays a previous course's average GPA on the y-axis and course number on the x-axis for each professor-course offered in the Spring of 2019. These data points are grouped by department and allow the user to scroll through a graph window and view GPAs across departments. Hovering over a data point on the graph allows the user to see a course's grade distribution, professor name, course name, and number of Q-drops. Furthermore, the tool includes several filters that allow the user to sort information by GPA, department, core requirement, honors or regular status, and whether the course meets the International and Cultural Diversity requirement.

In Fall 2018, 5,095 emails were sent to undergraduate students notifying them that they had been given access to our new course tool. A follow-up email was sent to each student one week after the initial email. Clicking the link in the email directed a student to a custom website that hosted the course tool. Upon arriving at the webpage, a student was prompted with a text box asking them to enter their TAMU email as well as their major (or undeclared if not applicable).
Intervention Start Date
2018-11-05
Intervention End Date
2019-05-01
Primary Outcomes
Primary Outcomes (end points)
Course tool usage
Primary Outcomes (explanation)
We will define and measure course tool usage in a number of ways:
• Total number of sign-ins and individuals using the tool.
• We will estimate the average treatment effect on the treated for these outcomes. To do this, we will match the emails students used to sign in against a list of undergraduate student emails. These outcomes will also be analyzed by a student's classification and major.
o A student's classification is defined as being in one of the four following groups: group 1 consists of Undergraduate Non-degree students and Freshmen (0-29 hours), group 2 consists of Sophomores (30-59 hours), group 3 consists of Juniors (60-89 hours), and group 4 consists of Seniors (90+ hours) and Postbaccalaureate Undergraduates.
• Additional analysis will look at the total number of sign-ins and individual usage by time elapsed since receiving access (i.e., one week post, two weeks post, etc.), by classification and major.
• We will also construct match ratios between our collected sign-in information and our initial data file, which contains student emails, unique identifying numbers, major, and classification group. This will also be done by classification group and major.
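The match-ratio construction described above can be sketched as follows. This is an illustrative reconstruction under stated assumptions, not the study's actual code; the field names (`email`, `group`) and the function name are hypothetical.

```python
from collections import Counter

# Hypothetical sketch of the match-ratio construction: the share of
# treated students in each classification group whose email appears in
# the sign-in log. Field names ("email", "group") are illustrative.
def match_ratios(roster, signin_emails):
    seen = {e.strip().lower() for e in signin_emails}
    totals, matched = Counter(), Counter()
    for student in roster:
        totals[student["group"]] += 1
        if student["email"].lower() in seen:
            matched[student["group"]] += 1
    return {g: matched[g] / totals[g] for g in totals}

# Toy example: two groups of two students, one sign-in matched per group
roster = [
    {"email": "a@tamu.edu", "group": 1},
    {"email": "b@tamu.edu", "group": 1},
    {"email": "c@tamu.edu", "group": 2},
    {"email": "d@tamu.edu", "group": 2},
]
print(match_ratios(roster, ["A@tamu.edu", "c@tamu.edu"]))  # {1: 0.5, 2: 0.5}
```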
Secondary Outcomes
Secondary Outcomes (end points)
Course and major selection in spring 2019. Class registration completion.
Secondary Outcomes (explanation)
Class registration completion will be measured by calculating how many days it took a student who viewed our course tool to complete their spring 2019 class registration. Course selection will be measured by the department of the classes in a student's schedule and the historical average GPA of the classes a student registered for in spring 2019. Major choice will be measured as the department a student majored in, whether they changed their major, and the average GPA (of classes) in the department they are majoring in.
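The registration-completion measure reduces to a days-elapsed calculation, sketched below. The reference date is an assumption (the source does not pin down the starting point), and the example dates are made up.

```python
from datetime import date

# Illustrative calculation of the registration-completion measure: days
# elapsed from a reference date (here assumed to be the student's first
# course tool visit) to the date registration was completed.
def days_to_complete(reference, registration_done):
    return (registration_done - reference).days

print(days_to_complete(date(2018, 11, 5), date(2018, 11, 12)))  # 7
```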
Experimental Design
Experimental Design
We use a complete list of student emails, along with major and student unique identifying number, to randomly select a 10% sample from each of the four grade classifications (classification is based on student credit hour accumulation). A small subset of students (<.5%) were missing an email or student ID and were dropped before randomization. Our final sample consists of 5,095 undergraduate students.
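The stratified 10% draw can be sketched as follows. This is a minimal sketch under stated assumptions, not the study's actual randomization code; the roster fields (`id`, `group`) and the fixed seed are illustrative.

```python
import random

# Illustrative sketch of the stratified draw: a 10% random sample within
# each of the four classification groups. The roster fields ("id",
# "group") and the fixed seed are assumptions, not the study's own code.
def draw_treatment_sample(students, frac=0.10, seed=0):
    rng = random.Random(seed)
    treated = []
    for group in sorted({s["group"] for s in students}):
        members = [s for s in students if s["group"] == group]
        k = round(len(members) * frac)  # 10% of this stratum
        treated.extend(rng.sample(members, k))
    return treated

# Toy roster: 200 students split evenly across groups 1-4
students = [{"id": i, "group": i % 4 + 1} for i in range(200)]
sample = draw_treatment_sample(students)
print(len(sample))  # 20 treated students (5 per group)
```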

One week before the Spring 2019 sign-up period, an email was sent to each selected student advising them that they had been given access to our course tool, with a brief description of the tool and a link to access it (we also noted in the email that this was part of an IRB-approved study). There were four different opening sign-up times for students, based on a classification assigned by TAMU (classification is a function of credit hour accumulation). Hence, each selected student within a classification group received an email one week before its opening sign-up period. The classification groups are defined as follows: group 1 consists of Undergraduate Non-degree students and Freshmen (0-29 hours), group 2 consists of Sophomores (30-59 hours), group 3 consists of Juniors (60-89 hours), and group 4 consists of Seniors (90+ hours) and Postbaccalaureate Undergraduates. As noted above, one week after receiving the initial email, a student received a follow-up email with the same information as the first.

Upon clicking the access link in an email, a student would be directed to our custom webpage where we hosted the course tool. Before accessing the tool, a student was asked to enter their TAMU email address and major (or undeclared if not applicable). A student had to complete these fields and click a submit button before each access of the course tool. We placed no limit on the time or number of times a student could access the course tool. As researchers, we can view the emails and majors entered along with a time stamp.
Experimental Design Details
The only restrictions on the login fields were that the text entered in the email field had to contain "@tamu" and the field asking for major could not be blank. There was also no restriction on who could enter this information.
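The two stated restrictions amount to a simple validity check, sketched below; the function name and return convention are illustrative, not the site's actual implementation.

```python
# Minimal sketch of the stated login restrictions: the email entry must
# contain "@tamu" and the major entry must be non-blank. The function
# name is illustrative.
def valid_login(email, major):
    return "@tamu" in email and major.strip() != ""

print(valid_login("aggie@tamu.edu", "Economics"))     # True
print(valid_login("someone@gmail.com", "Economics"))  # False
print(valid_login("aggie@tamu.edu", "   "))           # False
```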
Randomization Method
We randomly select a 10% sample within each of the four grade classification groups: Group 1 consists of Undergraduate Non-degree students and Freshmen (0-29 hours). Group 2 consists of Sophomores (30-59 hours). Group 3 consists of Juniors (60-89 hours). Group 4 consists of Seniors (90+ hours) and Postbaccalaureate Undergraduates.
Randomization Unit
We randomized students into treatment within each of the four university classification groups.
Was the treatment clustered?
No
Experiment Characteristics
Sample size: planned number of clusters
no clusters
Sample size: planned number of observations
5,095 students in total were sent an email
Sample size (or number of clusters) by treatment arms
The breakdown of 5,095 is as follows: Group 1: 1,102, Group 2: 1,134, Group 3: 1,142, Group 4: 1,717.
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
n/a
IRB
INSTITUTIONAL REVIEW BOARDS (IRBs)
IRB Name
Texas A&M University IRB
IRB Approval Date
2019-01-09
IRB Approval Number
IRB2018-1195D
Analysis Plan
Analysis Plan Documents
Analysis Plan

MD5: 13f46cd0f10de7f30ed3a540614211a1

SHA1: b3a02a87850525ff7c23714f50b371e7890ef17b

Uploaded At: January 22, 2019

Post-Trial
Post Trial Information
Study Withdrawal
Intervention
Is the intervention completed?
No
Is data collection complete?
Data Publication
Data Publication
Is public data available?
No
Program Files
Program Files
Reports and Papers
Preliminary Reports
Relevant Papers