Structured Study Time, Self-efficacy, and Tutoring
Esther Duflo (MIT and J-PAL) and Abhijit Banerjee (MIT and J-PAL)
In this evaluation, we implemented a series of interventions during the spring 2014 run of the online course "14.73x: The Challenges of Global Poverty." These interventions tested methods that might encourage students to persist in the course and improve their performance: (1) a tutoring intervention, in which groups of about 25 students each were assigned a tutor for the duration of the course; (2) a self-efficacy intervention, in which students were randomly shown one of several messages about who succeeds in the course, to test for impacts on performance among various subgroups; and (3) a regular study time intervention, in which students could opt in to commit to up to three regular weekly study times, and some opted-in students were randomly sent enforcement reminders indicating how closely they were sticking to their study times. The experiment is now complete: all data were collected during the 14 weeks of the course, and the analysis was completed shortly thereafter. No substantial changes to the research design, timeline, or data collection plan were made before or during the interventions. In total, 19,694 students registered for the course, of whom 4,704 were active; 1,148 students earned a certificate by achieving a grade of at least 50%, and 1,250 students were active on the forum.
Tutoring: During the second and third weeks of the course, students were able to enter the tutoring lottery. 775 students entered, of whom half were randomly assigned to receive a tutor and placed into one of 14 groups based on scheduling preferences submitted at the time of the lottery. Tutors worked with their groups on a weekly basis for the remaining 10 weeks of the course and hosted a dedicated final exam review. We collected feedback from tutors and tutees at the end of the course.
Self-Efficacy: Self-efficacy messages were displayed to students who completed the entrance survey, which was released during the first week of the course. To increase participation, we offered a random drawing for one of five signed copies of Poor Economics to all students who completed the entrance survey by a certain date. Our sample consisted of the 3,834 students who took the entrance survey within the first half of the course. These students were randomly allocated to see either no message or one of three messages: (1) a generic message, (2) a message about female students performing well in the course, or (3) a message about non-native English speakers performing well in the course. Of the 3,834 students assigned to one of the self-efficacy treatments, 2,872, or about 75%, were active at some point during the course.
Regular Study Time: Students who completed the entrance survey within the first half of the course were also randomly assigned to one of four treatment groups: (1) control, (2) option for regular study time, (3) option for regular study time with a simple enforcement message, and (4) option for regular study time with reminder emails indicating how many times the student had logged on within 30 minutes of one of their designated study times. 3,510 students were included in this intervention, of whom 2,724 were active at some point during the course. Across the three regular study time groups, 1,240 active students, about 46% of those given the option, signed up for a regular study time. Reminder emails were sent to 333 students, of whom 254 were active.
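For illustration, balanced random assignment across four arms like the one described above can be sketched as follows. This is a hypothetical sketch, not the study's actual randomization code; the arm labels and the fixed seed are assumptions introduced here.

```python
import random

# Hypothetical arm labels for the four RST treatment groups described above.
ARMS = ["control", "rst_option", "rst_enforcement", "rst_reminders"]

def assign_arms(student_ids, seed=42):
    """Shuffle the student IDs with a fixed seed for reproducibility,
    then deal them round-robin into the four arms so group sizes
    differ by at most one."""
    ids = list(student_ids)
    random.Random(seed).shuffle(ids)
    return {sid: ARMS[i % len(ARMS)] for i, sid in enumerate(ids)}

# 3,510 students were included in the RST intervention.
assignment = assign_arms(range(3510))
counts = {arm: sum(1 for a in assignment.values() if a == arm) for arm in ARMS}
print(counts)  # roughly 3510 / 4 students per arm
```

Shuffle-then-deal guarantees near-equal arm sizes; simple independent coin flips per student would instead leave arm sizes random.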
Results: For the self-efficacy and tutoring interventions, we examined the treatment effect on eight primary outcomes: course completion, certificate earned, fraction of the course completed, hours spent, overall grade, final exam grade, attempted final exam, and final exam grade conditional on attempting. We find no statistically significant treatment effect for either intervention on any of the eight outcomes. The gender-based self-efficacy message has a positive point estimate for female students on seven of the eight outcomes; however, none of these effects is statistically significant.
For the tutoring intervention, we also measured the impact of being assigned to a tutor on the intensity of interaction with the staff and the course. Those assigned to a tutoring group were significantly more likely to have any interaction with staff (with a tutor or on the forum) and any engagement with the course, as we would expect, since only those assigned to the tutoring treatment could interact with a tutor. However, we find no impact of treatment on other measures of engagement, including the number of interactions with staff on the forum, any interaction with staff on the forum, any activity on the forum, and the number of forum posts.
For the regular study time (RST) intervention, we used assignment to RST as an instrument for opting in to RST to estimate the treatment effect on our eight primary outcomes. We find a small and positive but insignificant impact of opting in on seven of the eight outcomes. Similarly, we find a small and positive impact of opting in to email reminders on five of the eight outcomes, again insignificant. We find no significant impact of assignment to RST on various measures of engagement with the course, including hours spent on course, total activity count, and total sessions (all three estimated separately for the full course, the first half only, and the second half only).
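With a single binary instrument (assignment to RST) and a single binary treatment (opting in), the instrumental-variables strategy described above reduces to the Wald estimator: the reduced-form effect of assignment on the outcome divided by the first-stage effect of assignment on opt-in. The following is a minimal sketch on synthetic data, not the study's data; the opt-in rate and effect size are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Hypothetical synthetic data (not the study's data):
# z = random assignment to the RST option (the instrument),
# d = actually opting in (only possible when assigned),
# y = an outcome such as fraction of the course completed.
z = rng.integers(0, 2, n)
d = z * (rng.random(n) < 0.46)                 # ~46% opt-in among the assigned
y = 0.30 + 0.05 * d + rng.normal(0, 0.10, n)   # true effect of opting in: 0.05

# Wald / IV estimate: reduced form divided by first stage.
reduced_form = y[z == 1].mean() - y[z == 0].mean()
first_stage = d[z == 1].mean() - d[z == 0].mean()
iv_estimate = reduced_form / first_stage
print(round(iv_estimate, 3))
```

Because no student assigned to control could opt in, the estimate can be read as the average effect of opting in among students who opt in when given the option (a local average treatment effect).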
Data Publication: All data were cleaned and de-identified prior to analysis. The complete set of de-identified raw and clean/coded data, as well as all Stata programs that clean and analyze the data, have been prepared for publication. Under the MOU set up with MITx at the outset of our research, we do not have permission to share "learner data" (scores and activity tracking data collected) outside of our research team. Depending on the permissions granted by MITx, we will make all or some portion of the de-identified data available on the J-PAL Dataverse.