Are Online News Consumers Sophisticated Enough to Detect Media Bias?

Last registered on March 06, 2023


Trial Information

General Information

Are Online News Consumers Sophisticated Enough to Detect Media Bias?
Initial registration date
May 15, 2021

First published
May 17, 2021, 10:29 AM EDT

Last updated
March 06, 2023, 3:24 PM EST



Primary Investigator

Harvard University

Other Primary Investigator(s)

PI Affiliation
Peking University
PI Affiliation
Peking University

Additional Trial Information

Start date
End date
Secondary IDs
Prior work
This trial does not extend or rely on any prior RCTs.
This project examines whether news consumers are sophisticated enough to detect media bias, using both observational and experimental data. Observational data from a Chinese micro-blogging platform suggest that users are more likely to repost "inconsistent" news, defined as news whose attitude differs from the media outlet's usual attitude. To explore whether this reposting pattern reflects strategic thinking, we run an online experiment that randomly varies whether news sources are revealed to news consumers, and we test the effect of this variation on their inferences about the importance of the news, their tendency to repost it, and related outcomes.
External Link(s)

Registration Citation

Huang, Yihong, Juanjuan Meng and Xi Weng. 2023. "Are Online News Consumers Sophisticated Enough to Detect Media Bias?" AEA RCT Registry. March 06.
Experimental Details


Intervention Start Date
Intervention End Date

Primary Outcomes

Primary Outcomes (end points)
Probability of reposting news, inference about the importance of the news, level of agreement with the news
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
We run an online experiment to test whether news consumers take media bias into account when inferring the importance of news and deciding whether to repost it. We randomly assign participants to one of four groups and show each group different information about four pieces of news (two consistent with the media outlet's general attitudes and two inconsistent).

- Participants in the Control group do not know which media outlet posted the news.
- Participants in the Media Names group are shown the media outlet that posted the news.
- Participants in the Media Attitudes group additionally receive a short introduction to the general attitudes of the media outlets.
- Participants in the Debias group are additionally reminded about the existence of media bias before they read the news.

After reading each piece of news, participants answer the following questions:
- To what extent do you agree with the opinion expressed in the news above?
- Please guess the Baidu index of the news above (a proxy for the importance of the news).
- What is the probability that you would repost the news above?
- If you decided to repost the news above, would you add a comment to show support or rejection?
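The assignment step described above (computer randomization into four arms of 150 participants per 600-person round) can be sketched as follows. The arm names come from the registration; the function, the seed, and the round-robin balancing scheme are illustrative assumptions, not the authors' actual code:

```python
import random

# Arm names as listed in the registration.
ARMS = ["Control", "Media Names", "Media Attitudes", "Debias"]

def assign_arms(participant_ids, arms=ARMS, seed=0):
    """Return a dict mapping each participant to an arm, balanced across arms."""
    rng = random.Random(seed)  # fixed seed so the assignment is reproducible
    ids = list(participant_ids)
    rng.shuffle(ids)
    # Deal the shuffled participants round-robin, so arm sizes differ by at most one.
    return {pid: arms[i % len(arms)] for i, pid in enumerate(ids)}

assignment = assign_arms(range(600))
counts = {arm: sum(1 for a in assignment.values() if a == arm) for arm in ARMS}
print(counts)  # each arm receives 600 / 4 = 150 participants
```

Shuffling first and then dealing round-robin guarantees exactly equal arm sizes when the sample divides evenly, unlike independent per-participant draws.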
Experimental Design Details
Randomization Method
Randomization done in office by a computer
Randomization Unit
Was the treatment clustered?

Experiment Characteristics

Sample size: planned number of clusters
The treatment is not clustered.
Sample size: planned number of observations
600 individuals in each round, across two rounds (1,200 planned observations in total).
Sample size (or number of clusters) by treatment arms
In each round, we will have 600 individuals, with 150 participants in each treatment arm.
In total, we will have 300 participants in each treatment arm.
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)

Institutional Review Boards (IRBs)

IRB Name
IRB Approval Date
IRB Approval Number


Post Trial Information

Study Withdrawal



Is the intervention completed?
Intervention Completion Date
June 30, 2021, 12:00 +00:00
Data Collection Complete
Data Collection Completion Date
June 30, 2021, 12:00 +00:00
Final Sample Size: Number of Clusters (Unit of Randomization)
Was attrition correlated with treatment status?
Final Sample Size: Total Number of Observations
1266 individuals.

We conducted the experiment with two different samples, with 632 and 634 participants respectively. In the first round, we conducted the experiment with college students at an elite university in China. In the second round, we collected responses from the general population.
Final Sample Size (or Number of Clusters) by Treatment Arms
322 in the Control group, 301 in the Media Names group, 295 in the Media Attitudes group, and 348 in the Debias group.
Data Publication

Data Publication

Is public data available?


Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials