Examining responses to advertising on misinformation websites

Last registered on December 07, 2022

Pre-Trial

Trial Information

General Information

Title
Examining responses to advertising on misinformation websites
RCT ID
AEARCTR-0009973
Initial registration date
August 29, 2022

Initial registration date is when the trial was registered; it corresponds to when the registration was submitted to the Registry to be reviewed for publication.

First published
August 29, 2022, 5:16 PM EDT

First published corresponds to when the trial was first made public on the Registry after being reviewed.

Last updated
December 07, 2022, 3:46 PM EST

Last updated is the most recent time when changes to the trial's registration were published.

Locations

Region

Primary Investigator

Affiliation

Other Primary Investigator(s)

PI Affiliation
Carnegie Mellon University
PI Affiliation
Stanford University
PI Affiliation
Stanford University

Additional Trial Information

Status
In development
Start date
2022-08-29
End date
2023-03-31
Secondary IDs
Prior work
This trial does not extend or rely on any prior RCTs.
Abstract
Online misinformation is largely financially sustained via advertising revenue from ads placed automatically on misinformation websites by digital platforms. The financial motivation to earn advertising revenue by spreading misinformation has been widely conjectured to be one of the main reasons misinformation continues to be prevalent across various digital platforms. Despite attempts by digital media platforms to reduce ad revenue going to misinformation websites and the existence of tools that companies can adopt to avoid advertising on misinformation websites, ads from multiple well-known companies continue to appear on misinformation websites. We conduct randomized experiments to measure how people react to information about the role played by advertising companies and digital ad platforms in placing ads on misinformation websites. In our first study, we measure how consumption behavior changes when people receive different pieces of information about ads appearing on misinformation websites. Additionally, we measure whether people respond to our information treatments in other ways, such as by voicing their concerns or by changing their attitudes, beliefs and preferences. Our second study is focused on measuring the beliefs, preferences and choices of employees and decision-makers within companies about ads appearing on misinformation websites. In addition to measuring their stated preferences, revealed preferences and baseline beliefs about such ads, we measure how their behavior changes in response to the information provided.
External Link(s)

Registration Citation

Citation
Ahmad, Wajeeha et al. 2022. "Examining responses to advertising on misinformation websites." AEA RCT Registry. December 07. https://doi.org/10.1257/rct.9973-4.0
Experimental Details

Interventions

Intervention(s)
In our first study, we measure how people respond to information about advertising on misinformation websites. Using an incentive-compatible design, we measure how people change their consumption and voice concerns about advertising company and platform practices in response to the information provided. For additional details about this experiment, see the attached plan.

In our second study, we measure how employees and decision-makers within companies respond to information about advertising on misinformation websites. We measure their baseline beliefs and preferences relevant to advertising on misinformation websites, and measure their behavior in response to the information provided.
Intervention Start Date
2022-08-29
Intervention End Date
2023-02-15

Primary Outcomes

Primary Outcomes (end points)
Study 1: Our main outcome of interest is whether participants switch their gift card preference, i.e. whether participants select a gift card after the information treatment that differs from the top choice they indicated before the treatment. For additional details, see the attached plan.

Study 2: Our key outcome variables are demand for information (i.e. a binary variable that takes a value of one when participants choose to receive information on which platforms least frequently place companies' ads on misinformation websites and zero otherwise) and donation preference (i.e. a binary variable that takes a value of one when participants prefer a donation to a non-profit organization that helps digital ad platforms identify and reduce advertising on misinformation websites and zero otherwise).
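
As an illustration only, the outcome coding described above could be operationalized as in the minimal sketch below; the variable and option names are hypothetical and not taken from the study's survey instrument or analysis plan.

```python
# Illustrative sketch only; variable and option names are hypothetical,
# not taken from the study's survey instrument or analysis plan.

def switched_gift_card(pre_treatment_top_choice: str, final_choice: str) -> int:
    """Study 1 primary outcome: 1 if the gift card chosen after the
    information treatment differs from the pre-treatment top choice."""
    return int(final_choice != pre_treatment_top_choice)

def as_binary(chose_option: bool) -> int:
    """Study 2 outcomes (demand for information, donation preference), coded 1/0."""
    return int(chose_option)

print(switched_gift_card("Gift card A", "Gift card B"))  # 1 -> switched
print(as_binary(True))                                   # 1 -> chose to receive information
```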
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
Study 1: Participants are given the option to sign a real online petition. For this outcome, we record a series of variables. We first record a variable indicating whether people intend to sign the petition. We also record whether people clicked on the petition link provided and whether they report having signed it. Finally, we count the number of signatures on the petition pages. In addition, we record people's stated preferences and self-reported attitudes about misinformation and company practices.

Study 2: We measure participants' posterior beliefs about the role played by digital ad platforms in placing ads on misinformation websites following our information treatment.
Secondary Outcomes (explanation)
Study 1: The petition outcome serves two purposes. First, it allows us to measure another way in which people may respond to our information treatments other than by changing their consumption behavior. Second, since participants must choose between signing either company or platform petitions, this outcome allows us to measure whether, across our treatments, people hold advertising companies more responsible than the digital ad platforms that automatically place ads for companies. Apart from measuring the proportions of our petition outcomes across our randomized groups, we will also measure the difference between the company and platform petition outcomes. To measure stated preferences, we ask participants about their degree of agreement with general statements about misinformation on a seven-point scale ranging from "strongly agree" to "strongly disagree". We similarly record self-reported attitudes towards respondents' top choice gift card company. We also measure people's beliefs about the role played by digital ad platforms following our information treatments on a continuous scale. See the attached plan.

Study 2: Posterior beliefs are measured on a continuous scale. Participants are told about the average number of advertising companies whose ads appear per month on misinformation websites that do not use digital ad platforms. They are then asked to estimate the average number of advertising companies whose ads appear per month on misinformation websites that do use digital ad platforms. We will winsorize the responses collected to remove outliers.
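
The winsorization cutoffs are not stated here; a minimal sketch, assuming symmetric 1st/99th-percentile limits (an illustrative assumption, not the registered specification), might look like the following.

```python
import numpy as np

def winsorize(responses, lower_pct=1, upper_pct=99):
    """Clip belief responses to the given percentile bounds.

    The 1st/99th-percentile cutoffs are illustrative assumptions; the
    registered analysis plan may specify different limits.
    """
    x = np.asarray(responses, dtype=float)
    lo, hi = np.percentile(x, [lower_pct, upper_pct])
    return np.clip(x, lo, hi)

# Hypothetical posterior-belief estimates (advertising companies per month)
beliefs = [120, 85, 90, 10_000, 75, 60, 95]
print(winsorize(beliefs))
```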

Experimental Design

Experimental Design
Study 1:
After eliciting demographic information and news preferences, we inform participants that one in five (i.e. 20% of all respondents) who complete the survey will be offered a $25 gift card from a company of their choice out of six company options. Respondents are asked to rank the six gift card companies from their first choice (most preferred) to their sixth choice (least preferred). The order in which the six companies are presented is randomized at the respondent level. We then ask participants to confirm which gift card they would like to receive if they are selected. This question serves to check whether respondents have consistent preferences regardless of the type of question used to elicit their most preferred gift card. Additionally, we ask respondents to assign weights to each of the six gift card options. This question gives respondents greater flexibility by allowing them to indicate indifference or no preference (i.e., equal weights) between any set of options. Respondents then report how frequently they have used each of the six companies in the past 12 months.

All participants in the experiment are provided with baseline information on misinformation and advertising. Participants are then randomized into five groups to receive different information treatments, all based on factual information from prior research and our data. We use an active control design in order to isolate the effect on people's behavior of providing information relevant to the practices of specific companies.

Our experiment includes the following five randomized groups:

(1) Control: Participants in the control group are given generic information based on prior research that is unrelated to any specific company but relevant to the topic of news and misinformation.
(2) T1 (Company only): Participants are given factual information that ads from their top choice gift card company appeared on misinformation websites in the recent past.
(3) T2 (Platform only): Participants are given factual information that companies that used digital ad platforms were about 10 times more likely to have their ads appear on misinformation websites than companies that did not use such platforms in the recent past.
(4) T3 (Company and platform): This arm combines the information from T1 and T2. As in T1, participants are given factual information that ads from their top choice gift card company appeared on misinformation websites in the recent past. Additionally, we inform participants that their top choice company used digital ad platforms and that companies that used such platforms were about 10 times more likely to have their ads appear on misinformation websites than companies that did not use digital ad platforms.
(5) T4 (Company ranking): Participants are given factual information that ads from all six gift card companies appeared on misinformation websites in the recent past, along with a ranking based on their intensity of advertising on misinformation websites. We personalize these rankings based on data from different years (i.e. 2019, 2020 or 2021) such that the respondent's top choice gift card company does not appear last in the ranking (i.e. is not the company that advertises least on misinformation websites) and, in most cases, advertises more intensely on misinformation websites than its potential substitute in the same company category (fast food, food delivery or ride-sharing).

After the information treatment, all participants are asked to make their final gift card choice from the same six options they were shown earlier. To ensure incentive compatibility, participants are told that those who are randomly selected to receive a gift card will be offered the gift card of their choice at the end of our study. Additionally, participants are given the option to sign a real online petition and report their preferences and attitudes.

Study 2:
We first elicit participants' current employment status. All those who are working in some capacity are allowed to continue the survey; the rest are screened out. After asking for participants' main occupation, all participants in the experiment are provided with baseline information on misinformation and advertising. To record their stated preferences, participants are then asked to share how much they agree or disagree with a series of statements on a scale from "strongly disagree" to "strongly agree". The order in which these statements are presented is randomized for all participants. To record their revealed preferences, participants are asked three questions in a randomized order:

(1) Information demand about consumer responses, i.e. whether they would like to learn how people respond to companies whose ads appear on misinformation websites.
(2) Ad check, i.e. whether they would like to know about their own company's ads appearing on misinformation websites in the recent past.
(3) Demand for solution, i.e. whether they would like to sign up for an information session on how companies can manage whether their ads appear online.

For information demand about consumer responses, participants are told that they would be provided with this information at the end of our survey if they choose to receive it. For the ad check and demand for solution, participants are told that they would receive this information in a follow-up email after survey completion.

Next, we record participants' baseline beliefs about the role of companies and platforms in placing ads on misinformation websites. Participants are then randomized into a treatment group, which receives information about the role of digital ad platforms in placing ads on misinformation websites, and a control group, which does not receive this information. Following this information intervention, we elicit participants' posterior beliefs about the role played by digital ad platforms in placing ads on misinformation websites as well as their demand for information and donation preference. We then ask participants to answer a few additional questions about their background, preferences, place of work, and feedback on the survey. Participants who opted to learn more in our information demand questions are provided with this information at the end of the survey.
Experimental Design Details
For additional details, please see the attached plan.
Randomization Method
Randomization will be carried out by the survey software, Qualtrics.
Randomization Unit
Randomization will be at the individual level.
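
Randomization is handled inside Qualtrics; the sketch below is only an outside illustration of the same logic for Study 1, assigning each individual independently to one of the five arms with equal probability (the arm labels follow the design above; the seed is an arbitrary choice for the sketch).

```python
import random

# Arm labels follow the five Study 1 groups described in the experimental design.
ARMS = ["Control", "T1: Company only", "T2: Platform only",
        "T3: Company and platform", "T4: Company ranking"]

def assign_arm(rng):
    """Draw one arm independently per respondent with equal probability,
    mirroring individual-level (non-clustered) randomization."""
    return rng.choice(ARMS)

# Example: three hypothetical respondents; the seed is only for reproducibility
rng = random.Random(2022)
print([assign_arm(rng) for _ in range(3)])
```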
Was the treatment clustered?
No

Experiment Characteristics

Sample size: planned number of clusters
The number of clusters will be the number of individuals recruited for the survey experiment.
Sample size: planned number of observations
Study 1: We expect about 3,750 individuals in the study.
Study 2: We expect to send out email invitations to approximately 120,000 individuals. Assuming a click rate of 1% and a completion rate of 50%, we expect about 600 responses in the study.
Sample size (or number of clusters) by treatment arms
Study 1: Participants are evenly randomized into five groups. We expect approximately 750 individuals in each randomized group.
Study 2: Participants are evenly randomized into two groups. We expect approximately 300 individuals in each randomized group.
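
For reference, the expected counts above follow from simple arithmetic on the stated assumptions (invitation count, click rate, completion rate, and even splits across arms); a quick check:

```python
# Study 2 expected responses: invitations x click rate x completion rate
invitations = 120_000
expected_study2 = invitations * 0.01 * 0.50      # 600 responses

# Even randomization across arms
study1_per_arm = 3750 / 5                        # 750 per arm across five arms
study2_per_arm = expected_study2 / 2             # 300 per arm across two arms

print(expected_study2, study1_per_arm, study2_per_arm)  # 600.0 750.0 300.0
```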
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
Supporting Documents and Materials

There is information in this trial unavailable to the public; access may be requested through the Registry.

IRB

Institutional Review Boards (IRBs)

IRB Name
Administrative Panels for the Protection of Human Subjects, Stanford University
IRB Approval Date
2022-07-26
IRB Approval Number
63897
Analysis Plan

There is information in this trial unavailable to the public; access may be requested through the Registry.


Post-Trial

Post Trial Information

Study Withdrawal

There is information in this trial unavailable to the public; access may be requested through the Registry.


Intervention

Is the intervention completed?
No
Data Collection Complete
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials