Information Bundling and Polarizing Persuasion

Last registered on August 05, 2024

Pre-Trial

Trial Information

General Information

Title
Information Bundling and Polarizing Persuasion
RCT ID
AEARCTR-0012729
Initial registration date
January 29, 2024


First published
January 31, 2024, 12:14 PM EST


Last updated
August 05, 2024, 8:40 PM EDT


Locations

Region

Primary Investigator
Giampaolo Bonomi

Affiliation
University of California San Diego (UCSD)

Other Primary Investigator(s)

Additional Trial Information

Status
In development
Start date
2024-02-15
End date
2024-09-01
Secondary IDs
Prior work
This trial does not extend or rely on any prior RCTs.
Abstract
We study how bundling together policy recommendations on different policy issues affects voters' policy views. Voters are randomized into (i) treatment messages, each consisting of policy recommendations on two policy issues, and (ii) control messages, where policy recommendations are sent separately. The issues bundled differ in ideological value, policy domain, and complexity. We investigate the presence of belief spillovers across policy domains, and the role played by trust and identity in explaining these spillovers.
External Link(s)

Registration Citation

Citation
Bonomi, Giampaolo. 2024. "Information Bundling and Polarizing Persuasion." AEA RCT Registry. August 05. https://doi.org/10.1257/rct.12729-2.0
Sponsors & Partners

There is information in this trial unavailable to the public.
Experimental Details

Interventions

Intervention(s)
Intervention Start Date
2024-07-02
Intervention End Date
2024-08-04

Primary Outcomes

Primary Outcomes (end points)
Policy preferences on economic policy issues (trade policy, healthcare regulations, and redistributive policies) and social policy issues (abortion policy, affirmative action, LGBTQ+ rights, etc.)
Primary Outcomes (explanation)
Outcomes are captured on a four-point scale ranging from "strongly support" to "strongly oppose," and will be recoded as a binary variable (support or oppose).
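A minimal sketch of the recoding step described above. The numeric coding direction (1 = strongly support through 4 = strongly oppose) is an assumption for illustration; the registry does not specify how response codes are stored.

```python
# Hedged sketch: collapse a 4-point support scale to binary support/oppose.
# Assumed coding (not from the registry): 1 = strongly support,
# 2 = somewhat support, 3 = somewhat oppose, 4 = strongly oppose.

def recode_binary(response: int) -> int:
    """Return 1 for support (codes 1-2) and 0 for oppose (codes 3-4)."""
    if response not in (1, 2, 3, 4):
        raise ValueError(f"unexpected response code: {response}")
    return 1 if response <= 2 else 0

responses = [1, 2, 3, 4]
print([recode_binary(r) for r in responses])  # [1, 1, 0, 0]
```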

Secondary Outcomes

Secondary Outcomes (end points)
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
The experiment is implemented online on Amazon Mechanical Turk via CloudResearch. Before randomization, we collect the socio-demographic characteristics of respondents, as well as their perceived importance and knowledge of the policy issues used as main outcomes. Participants are then randomized into (i) treatment messages, each consisting of written policy recommendations on two policy issues, and (ii) control messages, where the same policy recommendations are reported separately (respondents observe one recommendation at a time). The issues bundled differ in ideological value, policy domain, and complexity. Before and after reading the messages, respondents are asked to provide their policy preferences on the two issues (in the main experiment, one preference is elicited before treatment and the other after treatment).
Experimental Design Details
Randomization Method
Randomization performed through Qualtrics' block randomization option.
Randomization Unit
Individual randomization
Was the treatment clustered?
No

Experiment Characteristics

Sample size: planned number of clusters
450-500 subjects for each treatment arm.
Main experiment: 5,000 individuals for the main policy pair (abortion and trade).
For other policy pairs, the main treatments will be implemented, bringing the estimated total to 12,000 participants taking part in the research study.
Sample size: planned number of observations
450-500 subjects for each treatment arm.
Sample size (or number of clusters) by treatment arms
For each pair of proposals on issues X and Y, an equal number of respondents will be allocated to the following three conditions: (i) recommendations on X and Y made by the same source; (ii) recommendations on X and Y sent separately (i.e., not sent together by the same source); (iii) no message shown. We plan to assign 450-500 subjects to each condition.
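The even three-arm allocation described above can be sketched as follows. This is an illustrative round-robin assignment, not the registry's actual Qualtrics block-randomization implementation; the condition labels are hypothetical.

```python
import random

# Hedged sketch: assign participants evenly to the three conditions
# described in the registration. Labels are illustrative assumptions.
CONDITIONS = ["bundled_same_source", "separate_messages", "no_message"]

def assign_evenly(participant_ids, seed=0):
    """Shuffle participants, then deal them round-robin into conditions,
    so arm sizes differ by at most one."""
    rng = random.Random(seed)
    ids = list(participant_ids)
    rng.shuffle(ids)
    return {pid: CONDITIONS[i % len(CONDITIONS)] for i, pid in enumerate(ids)}

assignment = assign_evenly(range(1500))
counts = {c: sum(1 for v in assignment.values() if v == c) for c in CONDITIONS}
print(counts)  # 500 participants per condition
```

With 1,500 participants, each arm receives exactly 500 subjects, matching the planned 450-500 per condition.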
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
IRB

Institutional Review Boards (IRBs)

IRB Name
UCSD Office of IRB Administration
IRB Approval Date
2023-10-30
IRB Approval Number
809110
Analysis Plan

There is information in this trial unavailable to the public.

Post-Trial

Post Trial Information

Study Withdrawal

There is information in this trial unavailable to the public.

Intervention

Is the intervention completed?
No
Data Collection Complete
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials