Fields Changed

Registration

Field: Last Published
Before: November 07, 2021 10:23 AM
After: November 07, 2021 10:24 AM

Papers

Field: Paper Abstract

Before:
Policymakers routinely make high-stakes decisions of which programs to fund. Assessing the value of a program is difficult and may be affected by bounded rationality. In an experiment with policymakers in the U.S. government, we find that respondents’ valuations of programs are inelastic with respect to the program’s impact. A complementary experiment among a representative sample of the general public reveals even more pronounced inelasticity in a population less familiar with making program funding decisions. We design and test two portable decision aids, one which presents two alternative programs side-by-side rather than in isolation and another which translates total program cost into an annual cost per person impacted. The decision aids increase elasticity by 0.20 on a base of 0.33 among policymakers and by 0.21 on a base of 0.21 among the general public. We provide evidence that cognitive noise—noisy assessments of complex inputs—is a mechanism that can help explain the observed inelasticity of program valuation with respect to impact.

After:
Policymakers routinely make high-stakes decisions of which programs to fund. Assessing the value of a program is difficult and may be affected by bounded rationality. In an experiment with policymakers in the U.S. government, we find that respondents’ valuations of programs are inelastic with respect to the program’s impact. A complementary experiment among a representative sample of the general public reveals even more pronounced inelasticity in a population less familiar with making program funding decisions. We design and test two portable decision aids, one which presents two alternative programs side-by-side rather than in isolation and another which translates total program cost into an annual cost per person impacted. The decision aids increase elasticity by 0.20 on a base of 0.33 among policymakers and by 0.21 on a base of 0.21 among the general public. We provide evidence that cognitive noise—noisy assessments of complex inputs—is a mechanism that can help explain the observed inelasticity of program valuation with respect to impact.