Professor will this be on the Exam?
Last registered on January 12, 2018

Pre-Trial

Trial Information
General Information
Title
Professor will this be on the Exam?
RCT ID
AEARCTR-0002669
Initial registration date
January 12, 2018
Last updated
January 12, 2018 5:56 PM EST
Location(s)
Primary Investigator
Affiliation
Baruch CUNY and NBER
Other Primary Investigator(s)
PI Affiliation
MIT
PI Affiliation
The Graduate Center, CUNY
Additional Trial Information
Status
In development
Start date
2017-12-27
End date
2018-08-30
Secondary IDs
Abstract
Today's students face many pressures vying to capture their attention, and undergraduate curricula increasingly rely on self-directed content exposure that occurs outside of the traditional classroom. This combination may exacerbate behavioral failures inhibiting human capital production. In a previous experiment, we found that email nudges directing students to certain ungraded online problems had little impact on student performance on very similar exam questions. Students instead focused on the graded assignments, and relatively few attempted the ungraded “nudged” problems. In this study, we test whether nudges to focus on graded assignments, accompanied by the message that such problems are likely to appear on the exam and sent to a random subset of students within each class, help students focus on core concepts more than when graded assignments are not nudged. In addition, we test whether nudges to practice ungraded problems, also with the hint that “questions like these are likely to appear on the exams,” increase student time spent online with the ungraded problems and enhance performance on the test. Nudges to focus on practice problems will also be sent to a random sub-sample within each class. Lastly, we will vary how much each graded assignment counts toward students' final grades to see whether exercises that count more garner more online effort and improve performance on exams.
External Link(s)
Registration Citation
Citation
Dench, Daniel, Theodore Joyce and Stephen O'Connell. 2018. "Professor will this be on the Exam?" AEA RCT Registry. January 12. https://doi.org/10.1257/rct.2669-1.0.
Former Citation
Dench, Daniel et al. 2018. "Professor will this be on the Exam?" AEA RCT Registry. January 12. http://www.socialscienceregistry.org/trials/2669/history/24902.
Experimental Details
Interventions
Intervention(s)
Intervention Start Date
2018-01-27
Intervention End Date
2018-05-20
Primary Outcomes
Primary Outcomes (end points)
Probability that a problem set was completed (binary), correct answer to questions corresponding to intervention problems (binary), combined score on questions corresponding to intervention problems (integer), Activity access corresponding to intervention (binary), combined activity access corresponding to intervention (integer), total time accessing activities related to intervention (log).
Primary Outcomes (explanation)
Secondary Outcomes
Secondary Outcomes (end points)
Secondary Outcomes (explanation)
Experimental Design
Experimental Design
The study design will use within-classroom variation in graded problems to identify the effect of online practice problems in teaching basic economic concepts to students. The study will also use within-person variation in the percentage of the final grade that each question is worth from week to week.
Experimental Design Details
Randomization occurs at the classroom level, into two equal groups, call them A and B. In several weeks of the semester, students can complete online questions that count toward their final grade. In half of these weeks, group A will receive problem set X, measuring important basic economic concepts covered that week. Group B will have problem set X available, but as ungraded practice problems. Students in groups A and B will be nudged to focus on problem set X. During the same weeks, group B will be assigned problem set Y, measuring different but equally challenging economic concepts covered that week. Group A will have problem set Y available as ungraded practice problems. Neither group A nor group B will be nudged regarding problem set Y. This structure allows us to contrast the effect of graded versus ungraded problems that are nudged on exam performance. We can compare those differences with exam performance for problem set Y, which was graded and ungraded without nudges. We hypothesize that nudging will narrow the gap in the probability of completing a graded versus an ungraded problem set. We can also isolate the effect of nudging more generally by estimating the difference in differences in the probability of completing problem sets X and Y, as well as in performance on exam questions similar to those in sets X and Y. Lastly, we can test whether problem sets that count more toward students' grades are more likely to be completed and more likely to improve student performance on exams.
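The difference-in-differences contrast described above can be sketched as a simple calculation. Problem set X is nudged for both arms and Y is not, so subtracting the graded-minus-ungraded gap for Y from the gap for X isolates the effect of nudging. The function and the numbers below are purely illustrative, not results from the study.

```python
def diff_in_diff(p_x_graded, p_x_ungraded, p_y_graded, p_y_ungraded):
    """Difference in differences across problem sets X (nudged) and Y
    (not nudged). Inputs are completion probabilities (or mean exam
    scores) for each graded/ungraded cell; a negative value means
    nudging narrowed the graded-vs-ungraded gap."""
    return (p_x_graded - p_x_ungraded) - (p_y_graded - p_y_ungraded)

# Hypothetical illustration: grading raises completion by 30 percentage
# points without nudges (set Y) but only 10 with nudges (set X), so
# nudging narrows the gap by 20 percentage points.
effect = diff_in_diff(0.90, 0.80, 0.85, 0.55)  # about -0.20
```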
Randomization Method
Done by a program in Stata, by classroom, using set seed and the runiform() function.
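A minimal sketch of this seeded randomization, translated to Python for illustration: each student gets a uniform draw, the roster is ranked on the draws, and the top half goes to one arm, mirroring the Stata set seed plus runiform() approach. The seed value and function name here are assumptions, not the study's actual code.

```python
import random

def randomize_two_arms(student_ids, seed=12345):
    """Split a roster into two equal arms (A, B) reproducibly.
    Sorting on seeded uniform draws is equivalent to ranking
    runiform() draws in Stata after set seed."""
    rng = random.Random(seed)
    # One uniform draw per student; sorting on the draws is a random permutation.
    ranked = sorted(student_ids, key=lambda _: rng.random())
    half = len(ranked) // 2
    return {"A": set(ranked[:half]), "B": set(ranked[half:])}

arms = randomize_two_arms([f"s{i:03d}" for i in range(20)])
```

Because the seed is fixed, rerunning the program reproduces the same assignment, which is the point of set seed in the original Stata workflow.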
Randomization Unit
Individual/Student
Was the treatment clustered?
No
Experiment Characteristics
Sample size: planned number of clusters
0 clusters
Sample size: planned number of observations
Approximately 660 students, although this may depend on registration and course drop patterns after the semester begins. In some estimations we will use two observations per student (for midterm and final exams) or estimate directly on question-level outcomes (as detailed in the full analysis plan).
Sample size (or number of clusters) by treatment arms
Approximately 330 individuals per treatment arm, with two treatment arms (A and B)
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
0.2 standard deviations on overall exam score, and 0.1 standard deviations in number of attempts of online practice problems.
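As a back-of-the-envelope check on the stated MDE, the standard normal approximation for a two-arm comparison of means gives MDE = (z_{1-α/2} + z_{power}) · sqrt(2 / n_per_arm) in standard-deviation units. The α and power values below are conventional defaults (5% two-sided, 80%), an assumption rather than the study's own calculation.

```python
from math import sqrt
from statistics import NormalDist

def mde_two_arms(n_per_arm, alpha=0.05, power=0.80):
    """Minimum detectable effect (in SD units) for a two-arm mean
    comparison, normal approximation, equal-sized arms, no clustering."""
    z = NormalDist()
    return (z.inv_cdf(1 - alpha / 2) + z.inv_cdf(power)) * sqrt(2.0 / n_per_arm)

mde = mde_two_arms(330)  # roughly 0.22 SD
```

With 330 students per arm this yields roughly 0.22 SD, consistent with the registered 0.2 SD for exam scores; the smaller 0.1 SD for practice-problem attempts presumably reflects the additional weekly observations per student.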
Supporting Documents and Materials

There are documents in this trial unavailable to the public.
IRB
INSTITUTIONAL REVIEW BOARDS (IRBs)
IRB Name
CUNY University Integrated Institutional Review Board
IRB Approval Date
2017-12-01
IRB Approval Number
#2015-1310
Post-Trial
Post Trial Information
Study Withdrawal
Intervention
Is the intervention completed?
No
Is data collection complete?
Data Publication
Data Publication
Is public data available?
No
Program Files
Program Files
Reports, Papers & Other Materials
Relevant Paper(s)
REPORTS & OTHER MATERIALS