
Fields Changed

Registration

Field: Abstract
Before: College students rely on a diverse set of information when making educational decisions. A largely unexplored area in the education literature is what information college students use to make course enrollment and field-of-study decisions. Our study aims to fill this gap by providing students with grade outcomes of previous courses taught at Texas A&M University in an easy-to-read graphical interface that allows comparison across classes. To study whether students use previous course outcome information, we implement a randomized controlled trial (RCT). To measure usage of our webpage, we introduce a brief survey asking for a student's Texas A&M email and major, and we measure total visits to our web tool page.
After: College students rely on a diverse set of information when making educational decisions. A largely unexplored area in the education literature is what information college students use to make course enrollment and field-of-study decisions. Our study aims to fill this gap by providing students with grade outcomes of previous courses taught at Texas A&M University in an easy-to-read graphical interface that allows comparison across classes. To study whether students use previous course outcome information, we implement a randomized controlled trial (RCT). To measure usage of our webpage, we introduce a brief survey asking for a student's Texas A&M email and major, and we measure total visits to our web tool page. To examine the effects on academic outcomes, we will also look at course and major selection in Spring 2019 as outcomes.
Field: Last Published
Before: January 04, 2019 10:40 PM
After: January 22, 2019 04:05 PM
Field: Intervention (Public)
Before: To study the impact of access to previous course grade information, we conduct a randomized controlled trial (RCT) by sending access to a random set of Texas A&M University (TAMU) student emails. The course tool uses historical grade information for courses taught at TAMU and displays it in an easy-to-read fashion using data visualization software. The tool plots a previous course's average GPA on the y-axis against course number on the x-axis for each professor-course pairing offered in Spring 2019. These data points are grouped by department and allow the user to scroll through a graph window and view GPAs across departments. Hovering over a data point shows a course's grade distribution, professor name, course name, and number of q-drops. The tool also includes several filters that let the user sort courses by GPA, department, core requirement, honors or regular status, and whether the course meets the International and Cultural Diversity requirement. In Fall 2018, 5,095 emails were sent to undergraduate students notifying them that they had been given access to our new course tool. A follow-up email was sent to each student one week after the initial email. Clicking the link in the email directed a student to a custom website that hosted the course tool. Upon arriving at the webpage, a student was prompted with a text box asking them to enter their TAMU email as well as their major (or undeclared if not applicable).
After: To study the impact of access to previous course grade information, we conduct a randomized controlled trial (RCT) by sending access to a random set of Texas A&M University (TAMU) student emails. The course tool uses historical grade information for courses taught at TAMU and displays it in an easy-to-read fashion using data visualization software. The tool plots a previous course's average GPA on the y-axis against course number on the x-axis for each professor-course pairing offered in Spring 2019. These data points are grouped by department and allow the user to scroll through a graph window and view GPAs across departments. Hovering over a data point shows a course's grade distribution, professor name, course name, and number of q-drops. The tool also includes several filters that let the user sort courses by GPA, department, core requirement, honors or regular status, and whether the course meets the International and Cultural Diversity requirement. In Fall 2018, 5,095 emails were sent to undergraduate students notifying them that they had been given access to our new course tool. A follow-up email was sent to each student one week after the initial email. Clicking the link in the email directed a student to a custom website that hosted the course tool. Upon arriving at the webpage, a student was prompted with a text box asking them to enter their TAMU email as well as their major (or undeclared if not applicable).
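The interactive plot described above maps naturally onto off-the-shelf visualization libraries. Below is a minimal sketch of the same layout; the registration does not name the software actually used, and the table, column names, and values here are hypothetical and illustrative only.

import pandas as pd
import plotly.express as px

# Hypothetical course records; the real tool draws on TAMU's historical
# grade data, which is not reproduced here.
courses = pd.DataFrame({
    "course_number": [101, 201, 304, 110],
    "avg_gpa":       [3.12, 2.87, 3.45, 2.95],
    "department":    ["ECON", "ECON", "MATH", "HIST"],
    "professor":     ["Smith", "Jones", "Lee", "Garcia"],
    "course_name":   ["Micro Theory", "Macro Theory", "Linear Algebra", "US History"],
    "q_drops":       [14, 22, 9, 17],
})

# Average GPA on the y-axis, course number on the x-axis, points grouped
# (colored) by department; hovering exposes the extra course detail.
fig = px.scatter(
    courses,
    x="course_number",
    y="avg_gpa",
    color="department",
    hover_data=["professor", "course_name", "q_drops"],
)
fig.show()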
Field: Experimental Design (Public)
Before: We use a complete list of student emails, along with each student's major and unique identification number, to randomly select a 10% sample from each of four classification groups (classification is based on student credit hour accumulation). A small subset of students (<0.5%) were missing an email or student ID and were dropped before randomization. Our final sample consists of 5,095 undergraduate students. One week before the Spring 2019 sign-up period, an email was sent to each selected student advising them that they had been given access to our course tool, with a brief description of the tool and a link to access it (we also noted in the email that this was part of an IRB-approved study). There were four different opening sign-up times for students, based on a classification assigned by TAMU (classification is a function of credit hour accumulation). Hence, each selected student within a classification group received an email one week before its opening sign-up period. The classification groups are defined as follows: group 1 consists of Undergraduate Non-degree students and Freshmen (0-29 hours), group 2 consists of Sophomores (30-59 hours), group 3 consists of Juniors (60-89 hours), and group 4 consists of Seniors (90+ hours) and Postbaccalaureate Undergraduates. One week after receiving the initial email, a student received a follow-up email with the same information as the first. Upon clicking the access link in an email, a student was directed to our custom webpage hosting the course tool. Before accessing the tool, a student was asked to enter their TAMU email address and major (or undeclared if not applicable). A student had to complete these fields and click a submit button before each access of the course tool. We placed no limit on the time or number of times a student could access the course tool. As researchers, we can view the emails and majors entered, along with a timestamp.
After: We use a complete list of student emails, along with each student's major and unique identification number, to randomly select a 10% sample from each of four classification groups (classification is based on student credit hour accumulation). A small subset of students (<0.5%) were missing an email or student ID and were dropped before randomization. Our final sample consists of 5,095 undergraduate students. One week before the Spring 2019 sign-up period, an email was sent to each selected student advising them that they had been given access to our course tool, with a brief description of the tool and a link to access it (we also noted in the email that this was part of an IRB-approved study). There were four different opening sign-up times for students, based on a classification assigned by TAMU (classification is a function of credit hour accumulation). Hence, each selected student within a classification group received an email one week before its opening sign-up period. The classification groups are defined as follows: group 1 consists of Undergraduate Non-degree students and Freshmen (0-29 hours), group 2 consists of Sophomores (30-59 hours), group 3 consists of Juniors (60-89 hours), and group 4 consists of Seniors (90+ hours) and Postbaccalaureate Undergraduates. One week after receiving the initial email, a student received a follow-up email with the same information as the first. Upon clicking the access link in an email, a student was directed to our custom webpage hosting the course tool.
Before accessing the tool, a student was asked to enter their TAMU email address and major (or undeclared if not applicable). A student had to complete these fields and click a submit button before each access of the course tool. We placed no limit on the time or number of times a student could access the course tool. As researchers, we can view the emails and majors entered, along with a timestamp.
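The 10% stratified draw is straightforward to reproduce with standard tools. A minimal sketch under stated assumptions: the roster below is synthetic, the classification labels stand in for TAMU's credit-hour groups, and the registration does not say which software performed the actual randomization.

import numpy as np
import pandas as pd

rng = np.random.default_rng(2019)
n = 50_000  # synthetic roster size; illustrative only

# Hypothetical student roster with a TAMU-style classification column.
roster = pd.DataFrame({
    "student_id": np.arange(n),
    "classification": rng.choice(
        ["Freshman", "Sophomore", "Junior", "Senior"], size=n
    ),
})

# Records missing an email or ID (<0.5% in the actual study) would be
# dropped here, before randomization.

# Draw a 10% simple random sample within each classification stratum.
treated = (
    roster.groupby("classification", group_keys=False)
          .sample(frac=0.10, random_state=2019)
)
print(treated["classification"].value_counts())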
Field: Power calculation: Minimum Detectable Effect Size for Main Outcomes
Before: n/a for main outcomes
After: n/a
Field: Secondary Outcomes (End Points)
Before: (empty)
After: Course and major selection in Spring 2019. Class registration completion.
Field: Secondary Outcomes (Explanation)
Before: (empty)
After: Class registration completion will be measured by how many days it took a student who viewed our course tool to complete their Spring 2019 class registration. Course selection will be measured by the departments of the classes on a student's schedule and the historical average GPA of the classes the student registered for in Spring 2019. Major choice will be measured as the department a student majors in, whether they changed their major, and the average GPA (of classes) in the department they are majoring in.
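The registration-completion measure reduces to a date difference per student. A minimal sketch, with hypothetical column names and illustrative dates (the actual timestamp fields are not specified in the registration):

import pandas as pd

# Hypothetical per-student records: first visit to the course tool and the
# date Spring 2019 registration was completed (values illustrative only).
students = pd.DataFrame({
    "student_id": [1, 2, 3],
    "first_tool_visit": pd.to_datetime(["2018-11-01", "2018-11-02", "2018-11-05"]),
    "registration_done": pd.to_datetime(["2018-11-09", "2018-11-03", "2018-11-05"]),
})

# Days from first viewing the tool to completing class registration.
students["days_to_register"] = (
    students["registration_done"] - students["first_tool_visit"]
).dt.days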

IRBs

Field: IRB Approval Date
Before: October 18, 2018
After: January 09, 2019

Fields Removed

Analysis Plans

Field: Document
Value: analysis+plan.docx
MD5: 4bffe1ef4400a9ccb849eae89a6098a3
SHA1: 2d041da0a0f2de70ab88ac0abe9e0aea261b05d5

Field: Title
Value: Analysis Plan
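The MD5 and SHA1 digests above let anyone holding a copy of the removed document confirm it is byte-identical to the registered version. A minimal sketch; the local file name is an assumption:

import hashlib

# Recompute both digests for a local copy of the analysis plan and compare
# them against the values recorded in the registry.
with open("analysis+plan.docx", "rb") as f:  # assumed local file name
    data = f.read()

print("MD5: ", hashlib.md5(data).hexdigest())
print("SHA1:", hashlib.sha1(data).hexdigest())

If both printed values match the registered digests, the local copy is the registered analysis plan.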