What drives support for inefficient corrective policies? Evidence from an energy ballot initiative

Last registered on October 06, 2020

Pre-Trial

Trial Information

General Information

Title
What drives support for inefficient corrective policies? Evidence from an energy ballot initiative
RCT ID
AEARCTR-0006399
Initial registration date
October 05, 2020

First published
October 06, 2020, 7:27 AM EDT


Locations

Region

Primary Investigator

Affiliation
Department of Agricultural and Resource Economics, UC Berkeley

Other Primary Investigator(s)

Additional Trial Information

Status
In development
Start date
2020-10-07
End date
2020-11-12
Secondary IDs
Abstract
In this paper I use an information provision experiment conducted around a vote on Nevada’s Renewable Portfolio Standard (RPS) to understand how voter beliefs inform support for price-based versus performance-based policies. By modeling how voters’ perceptions of policy attributes (cost, effectiveness, and equity) map to voting behavior, I will be able to decompose differences in policy support into mutable misperceptions of policy attributes versus differential aversion to certain policy instruments (e.g., a general distaste for tax-based policy). These results have important implications for designing politically feasible corrective policies.
External Link(s)

Registration Citation

Citation
Tarduno, Matthew. 2020. "What drives support for inefficient corrective policies? Evidence from an energy ballot initiative." AEA RCT Registry. October 06. https://doi.org/10.1257/rct.6399
Sponsors & Partners

Sponsors

Experimental Details

Interventions

Intervention(s)
[See PAP for details]

Voters will be asked whether they support an RPS ballot initiative, as well as whether they would support a (hypothetical) carbon tax ballot initiative. They will then be asked about: (a) the costs of each policy, (b) the efficacy of each policy, and (c) the distributional impacts of each policy.

Respondents will then be provided with source-randomized, non-deceptive information about these policy attributes. I will use the variation in beliefs about cost, efficacy, and regressivity induced by the information intervention to identify how voter beliefs determine support or opposition for corrective policies.
Intervention Start Date
2020-10-07
Intervention End Date
2020-11-12

Primary Outcomes

Primary Outcomes (end points)
See PAP
Primary Outcomes (explanation)
See PAP

Secondary Outcomes

Secondary Outcomes (end points)
See PAP
Secondary Outcomes (explanation)
See PAP

Experimental Design

Experimental Design
The experiment will consist of two surveys run on Amazon Mechanical Turk:

1) Elicit Priors: Before the November 2020 election, voters will be asked whether they support an RPS ballot initiative, as well as whether they would support a (hypothetical) carbon tax ballot initiative. They will then be asked to share their beliefs about: (a) the private incidence of each policy, (b) the efficacy of each policy, and (c) the distributional impacts of each policy.

Respondents will then be provided with information on these attributes. The information provision will be source-randomized and non-deceptive. For example, if a respondent receives information on the private incidence of an RPS, this information will be taken from one of two academic papers which come to different conclusions about this policy's costs to consumers.

2) Collect Posteriors: After the election, voters will be asked how they voted on the RPS ballot initiative, as well as whether they would have supported a (hypothetical) carbon tax ballot initiative. They will then be asked to share their posterior beliefs about: (a) the costs of each policy, (b) the efficacy of each policy, and (c) the distributional impacts of each policy.
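The source-randomized assignment described above can be sketched as follows. This is a minimal illustration only: the attribute names, source labels, and the per-respondent seeding scheme are hypothetical assumptions, not taken from the PAP; the actual randomization is handled by Qualtrics.

```python
import random

# Illustrative treatment dimensions; labels are hypothetical, not from the PAP.
ATTRIBUTES = ["cost", "efficacy", "distributional_impact"]
SOURCES = ["paper_A", "paper_B"]  # two academic papers with differing conclusions

def assign_information(respondent_id: int, seed: int = 0) -> dict:
    """Randomly assign a respondent one policy attribute and one information source."""
    # Seed per respondent so assignments are reproducible across runs.
    rng = random.Random(seed * 100_003 + respondent_id)
    return {
        "respondent_id": respondent_id,
        "attribute": rng.choice(ATTRIBUTES),
        "source": rng.choice(SOURCES),
    }

# Assign the full planned sample of 800 respondents.
assignments = [assign_information(i) for i in range(800)]
```

Because each respondent's draw is seeded independently, the same respondent always receives the same assignment, which is convenient when linking the pre- and post-election surveys.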

Experimental Design Details
Randomization Method
Randomization done by Qualtrics.
Randomization Unit
Individual
Was the treatment clustered?
No

Experiment Characteristics

Sample size: planned number of clusters
800
Sample size: planned number of observations
800
Sample size (or number of clusters) by treatment arms
800
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
IRB

Institutional Review Boards (IRBs)

IRB Name
UC Berkeley Committee for Protection of Human Subjects (CPHS)
IRB Approval Date
2020-09-14
IRB Approval Number
00006252
Analysis Plan

Analysis Plan Documents

Pre Analysis Plan for "What drives support for inefficient corrective policies? Evidence from an energy ballot initiative"

MD5: b3a0f76fdcf258620da842d5bf3b7741

SHA1: 46752dca56faa783e6253e6cfc8ea8e9227a8912

Uploaded At: October 05, 2020

Post-Trial

Post Trial Information

Study Withdrawal

There is information in this trial unavailable to the public.

Intervention

Is the intervention completed?
No
Data Collection Complete
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials