Which information sources are more trusted and why?

Last registered on April 28, 2022

Pre-Trial

Trial Information

General Information

Title
Which information sources are more trusted and why?
RCT ID
AEARCTR-0009316
Initial registration date
April 26, 2022

Initial registration date is when the trial was registered.

It corresponds to when the registration was submitted to the Registry to be reviewed for publication.

First published
April 28, 2022, 6:20 PM EDT

First published corresponds to when the trial was first made public on the Registry after being reviewed.

Locations

Region

Primary Investigator

Affiliation
University of Nottingham

Other Primary Investigator(s)

PI Affiliation
The Fuqua School of Business, Duke University
PI Affiliation
The Fuqua School of Business, Duke University
PI Affiliation
University of Nottingham
PI Affiliation
Duke University

Additional Trial Information

Status
In development
Start date
2022-04-27
End date
2022-04-30
Secondary IDs
Prior work
This trial does not extend or rely on any prior RCTs.
Abstract
We study how much people trust different information sources and why. In particular, we study how much people trust crowdsourced information compared to other sources and which factors affect trust in crowdsourced information. We study this in the context of information about pharmaceuticals.
External Link(s)

Registration Citation

Citation
Hinnosaar, Marit et al. 2022. "Which information sources are more trusted and why?" AEA RCT Registry. April 28. https://doi.org/10.1257/rct.9316-1.0
Sponsors & Partners

There is information in this trial unavailable to the public.
Experimental Details

Interventions

Intervention(s)
Intervention Start Date
2022-04-27
Intervention End Date
2022-04-30

Primary Outcomes

Primary Outcomes (end points)
See Experimental Design
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
Experimental design will remain hidden until the trial is complete.
Experimental Design Details
We conduct an online survey experiment. We measure how much people update their beliefs about pharmaceutical side effects as a response to informational treatments.

We use a 5×5 experimental design with two sets of treatments. In the first set of five experimental treatments, we provide the same information to all participants while truthfully varying which source is cited. We compare participants' incentivized estimates of the probability of a specific side effect across the following information sources: Wikipedia, scientists, the government (the Food and Drug Administration), and a pharmaceutical company (with or without mentioning that pharmaceutical companies have reported the probability of side effects to be lower than found in independent studies).

In the second set of five experimental treatments, we experimentally vary the true information that we give about the characteristics of Wikipedia's writers. Specifically, we compare participants' incentivized estimates of the probability of a specific side effect while all participants see the same information about side effects from Wikipedia. We compare the estimates while varying information about who writes Wikipedia: the number of Wikipedia writers, the writers' qualifications (medical doctors), and their potential conflict of interest.

We test the following hypotheses. First, is crowdsourced information from Wikipedia trusted more or less than information from scientists, the government (the Food and Drug Administration), or a pharmaceutical company? Second, do the size of the crowd and its qualifications affect trust? Third, does the writers' potential conflict of interest affect trust in crowdsourced information? Fourth, do the political leanings of readers affect trust? Fifth, does confidence in one's prior belief affect trust? Sixth, does personal experience with pharmaceuticals and side effects affect trust? Seventh, does experience with crowdsourced information affect trust in crowdsourced information from Wikipedia? Finally, we study how demographic characteristics such as gender, age, and race affect trust in these information sources.

The main outcome variable measures belief updating about the likelihood of pharmaceutical side effects.

Our analysis focuses on individuals who are Bayesian updaters. We exclude from the sample participants who complete the survey unreasonably fast.
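The registered sample sizes are consistent with a fully crossed design: 5 × 5 = 25 treatment arms, and 1750 individuals / 25 arms = 70 per arm, matching the per-arm figure below. A minimal sketch of this arm-assignment arithmetic, assuming the two five-level factors are fully crossed and assignment is balanced (the actual randomization is performed by the Qualtrics platform, and the level names here are hypothetical placeholders, not the registered labels):

```python
import random
from collections import Counter

# Hypothetical factor levels (placeholders; only Wikipedia, scientists, the
# FDA, and the pharmaceutical-company variants are named in the registration).
SOURCES = ["wikipedia", "scientists", "fda", "pharma", "pharma_disclosure"]
WRITER_INFO = ["baseline", "crowd_size", "qualifications", "conflict", "combined"]

# Fully crossing two five-level factors yields 25 treatment arms.
arms = [(s, w) for s in SOURCES for w in WRITER_INFO]
assert len(arms) == 25

# Balanced assignment of 1750 participants: cycle through the arms,
# then shuffle so that arm order is random across participants.
n_participants = 1750
assignments = [arms[i % len(arms)] for i in range(n_participants)]
random.shuffle(assignments)

# Each arm ends up with exactly 1750 / 25 = 70 participants.
counts = Counter(assignments)
assert all(count == 70 for count in counts.values())
```

This is only an illustration of why 70 per arm follows from the registered totals; simple (unbalanced) randomization, as a survey platform may use by default, would give arm sizes of approximately 70 rather than exactly 70.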
Randomization Method
Qualtrics survey platform
Randomization Unit
Individual
Was the treatment clustered?
No

Experiment Characteristics

Sample size: planned number of clusters
No clusters; 1750 individuals
Sample size: planned number of observations
1750 individuals
Sample size (or number of clusters) by treatment arms
70 individuals per treatment arm
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
IRB

Institutional Review Boards (IRBs)

IRB Name
NBER Institutional Review Board
IRB Approval Date
2022-04-18
IRB Approval Number
22_047
IRB Name
The Nottingham School of Economics Research Ethics Committee
IRB Approval Date
2022-03-02
IRB Approval Number
N/A

Post-Trial

Post Trial Information

Study Withdrawal

There is information in this trial unavailable to the public.

Intervention

Is the intervention completed?
No
Data Collection Complete
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials