
Fields Changed

Registration

Field                     | Before                     | After
Last Published            | November 02, 2018 10:30 AM | June 20, 2024 05:28 PM
Keyword(s)                | Education                  | Education
Building on Existing Work |                            | No

Papers

Field          | Before | After
Paper Abstract |        | We conducted a randomized factorial experiment to determine how displaying school information to parents in different ways affects what schools they choose for their children in a hypothetical school district. In a sample of 3,500 low-income parents of school-aged children, a small design manipulation, such as changing the default order in which schools were presented, induced meaningful changes in the types of schools selected. Other design choices such as using icons to represent data, instead of graphs or just numbers, or presenting concise summaries instead of detailed displays, also led parents to choose schools with higher academic performance. We also examined effects on parents' understanding of the information and their self-reported satisfaction and ease of use. In some cases, there were trade-offs. For example, representing data using only numbers maximized understanding, but adding graphs maximized satisfaction at the expense of understanding.
Paper Citation |        | Steven Glazerman, Ira Nichols-Barrer, Jon Valant, Jesse Chandler & Alyson Burnett (2020) The Choice Architecture of School Choice Websites, Journal of Research on Educational Effectiveness, 13:2, 322-350, DOI: 10.1080/19345747.2020.1716905
Paper URL      |        | https://doi.org/10.1080/19345747.2020.1716905
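The abstract above describes a full factorial design: each display choice (default order, data representation, level of detail) is varied independently, and every combination of levels defines one treatment arm. A minimal sketch of that crossing, with illustrative factor names and levels that are assumptions rather than the study's actual design:

```python
from itertools import product

# Hypothetical factors loosely modeled on the abstract's examples
# (default ordering, icons vs. graphs vs. numbers, concise vs. detailed).
factors = {
    "default_order": ["distance", "academics"],
    "data_display": ["numbers", "icons", "graphs"],
    "detail_level": ["concise", "detailed"],
}

# Full factorial crossing: one treatment arm per combination of levels.
arms = [dict(zip(factors, levels)) for levels in product(*factors.values())]
print(len(arms))  # 2 * 3 * 2 = 12 arms
```

The arm count multiplies across factors, which is why factorial experiments grow large quickly and why the second paper below turns to Bayesian methods to keep them feasible.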
Field          | Before | After
Paper Abstract |        | Background: Researchers often wish to test a large set of related interventions or approaches to implementation. A factorial experiment accomplishes this by examining not only basic treatment–control comparisons but also the effects of multiple implementation "factors" such as different dosages or implementation strategies and the interactions between these factor levels. However, traditional methods of statistical inference may require prohibitively large sample sizes to perform complex factorial experiments. Objectives: We present a Bayesian approach to factorial design. Through the use of hierarchical priors and partial pooling, we show how Bayesian analysis substantially increases the precision of estimates in complex experiments with many factors and factor levels, while controlling the risk of false positives from multiple comparisons. Research design: Using an experiment we performed for the U.S. Department of Education as a motivating example, we perform power calculations for both classical and Bayesian methods. We repeatedly simulate factorial experiments with a variety of sample sizes and numbers of treatment arms to estimate the minimum detectable effect (MDE) for each combination. Results: The Bayesian approach yields substantially lower MDEs when compared with classical methods for complex factorial experiments. For example, to test 72 treatment arms (five factors with two or three levels each), a classical experiment requires nearly twice the sample size as a Bayesian experiment to obtain a given MDE. Conclusions: Bayesian methods are a valuable tool for researchers interested in studying complex interventions. They make factorial experiments with many treatment arms vastly more feasible.
Paper Citation |        | Kassler, D., Nichols-Barrer, I., & Finucane, M. (2020). Beyond "Treatment Versus Control": How Bayesian Analysis Makes Factorial Experiments Feasible in Education Research. Evaluation Review, 44(4), 238-261. https://doi.org/10.1177/0193841X18818903
Paper URL      |        | https://doi.org/10.1177/0193841X18818903
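The partial-pooling idea in the abstract above can be illustrated with a small simulation. This is a sketch, not the authors' code: all numbers (12 arms, 50 subjects per arm, prior scale 0.1) are hypothetical, and the shrinkage weight is computed with known hyperparameters rather than the full hierarchical model the paper uses.

```python
import numpy as np

rng = np.random.default_rng(0)
n_arms, n_per_arm = 12, 50
tau, sigma = 0.1, 1.0                 # prior sd of arm effects; outcome sd

# True arm effects drawn from a common prior; observed arm means add noise.
true_effects = rng.normal(0.0, tau, n_arms)
arm_means = true_effects + rng.normal(0.0, sigma / np.sqrt(n_per_arm), n_arms)

# Classical "no pooling": take each arm's raw mean at face value.
no_pool = arm_means

# Partial pooling: shrink each arm mean toward the grand mean, weighting by
# the ratio of prior variance to prior-plus-sampling variance.
se2 = sigma**2 / n_per_arm
w = tau**2 / (tau**2 + se2)           # shrinkage weight, here 0.01/0.03 = 1/3
partial_pool = w * arm_means + (1 - w) * arm_means.mean()

def rmse(est):
    return float(np.sqrt(np.mean((est - true_effects) ** 2)))

print(f"no pooling RMSE:      {rmse(no_pool):.4f}")
print(f"partial pooling RMSE: {rmse(partial_pool):.4f}")
```

The pooled estimates typically land closer to the true effects, which is the precision gain that lets the Bayesian design detect smaller effects at a given sample size.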