Fields Changed

Registration

Field: Trial Title
Before: Evaluating Meta’s Policy Interventions to Prevent the Spread of Health Misinformation: Experimental evidence from the Global South
After: Evaluating Policy Interventions to Prevent the Spread of Health Misinformation on Social Media: Experimental evidence from the Global South
Field: Abstract
Before: Belief in misinformation causes confusion, reduces trust in authorities and encourages risky behaviours that can cause significant harm to health, as exemplified by the COVID-19 pandemic. Social media platforms have taken several policy measures to address this challenge: working with independent fact-checking companies to label inaccurate content, promoting verified information through prompts of fact-checked articles, or tailoring the algorithm to demote false posts in the newsfeed. But how effective are these measures? I aim to address this issue with a focus on Facebook and its policies to combat health-related misinformation in the context of the Global South. My study has three key goals. First, I will evaluate the effectiveness of specific policies currently used by Facebook to debunk misinformation using an online experiment. Second, I aim to examine design tweaks informed by behavioural science to improve the effectiveness of these existing policies. Finally, I will examine how core aspects of users’ identities and prior beliefs interact with the content of inaccurate posts to impact the efficacy of the labelling policies, accounting for potential demand-side factors that contribute to the spread of misinformation. I will collect data in two waves, spaced two to three weeks apart, to measure whether the effects endure over time.
After: Belief in misinformation causes confusion, reduces trust in authorities and encourages risky behaviours that can cause significant harm to health, as exemplified by the COVID-19 pandemic. Social media platforms have taken several policy measures to address this challenge: working with independent fact-checking companies to label inaccurate content, promoting verified information through prompts of fact-checked articles, or tailoring the algorithm to demote false posts in the newsfeed. But how effective are these measures? I aim to address this issue with a focus on Facebook and its policies to combat health-related misinformation in the context of the Global South. My study has three key goals. First, using an online survey experiment I will evaluate the effectiveness of a specific label currently used by Facebook to debunk misinformation. Second, I aim to examine design tweaks informed by behavioural science to improve the effectiveness of the existing labels. Finally, I will examine whether introducing a very low-cost and scalable digital media literacy intervention increases discernment between true and false, as well as the effectiveness of the label in debunking misinformation. I will collect data in two waves, spaced two weeks apart, to measure whether the effects endure over time.
Field: Trial Start Date
Before: November 21, 2022
After: November 25, 2022

Field: Last Published
Before: November 18, 2022 12:31 PM
After: November 23, 2022 07:53 PM
Field: Intervention (Public)
Before: Interventions: 1. "Missing context" and "Partly-false information" labels attached to inaccurate Facebook posts by the third-party fact-checking organisations that Meta collaborates with. 2. Promotion of links to related fact-checked articles. As part of an online survey experiment, I will show all participants screenshots of five real Facebook posts: four that contain true information, and one that has been labelled as either 'missing context' or 'partly false'. Participants in the no-label condition will see the post without labels, and those in the labelled conditions will see the posts with the original labels. In one treatment, I will show participants a slightly modified 'missing context' label that is more salient than the original one.
After: I am testing the effects of a specific label used by Facebook to flag inaccurate posts as such. Interventions: 1. Label attached to inaccurate posts by third-party fact-checking organisations working with Meta. 2. A slightly modified label that is more salient than the original one. 3. Original label + digital media literacy intervention (a simple low-cost and scalable intervention).

Field: Intervention Start Date
Before: November 21, 2022
After: November 25, 2022
Field: Planned Number of Clusters
Before: 4500
After: 4800

Field: Planned Number of Observations
Before: 4500
After: 4800

Field: Sample size (or number of clusters) by treatment arms
Before: 900
After: 600 per treatment arm, per stimuli
Field: Intervention (Hidden)
Before: In the salience treatment, I will replace the black and white information sign that Facebook applies to the label with a red warning sign.
After: As part of an online survey experiment, I will show all participants screenshots of five real Facebook posts: four that contain true information, and one that has been labelled as 'missing context' by the third-party fact-checking organisations that Meta collaborates with. Facebook defines missing context as "implying a false claim without directly stating it". Amongst all the posts that are fact-checked and labelled on Facebook, posts labelled as 'missing context' are the most likely to be encountered by users, as they are not demoted by the Facebook algorithm in users' newsfeeds. Participants in the no-label condition will see the post without the label, and those in the labelled conditions will see the posts with the original label. There will be three treatments: 1. The original 'missing context' label attached to the inaccurate post by third-party fact-checking organisations. 2. Salience: I will replace the black and white information sign that Facebook applies to the label with a more salient red warning sign. 3. Original label + digital media literacy intervention: I will include a screenshot of a post from the United Nations Facebook page that urges users to stop before they share and to think about who wrote the post, why, the source of the information, etc. This post will be shown as a "sponsored" post, as promoting digital media literacy information in the form of Facebook ads would be an easy-to-implement, low-cost and scalable intervention that could be adopted by Facebook or by governments/international health organisations.
Field: Secondary Outcomes (End Points)
Before: Trust in science, experts and authorities; Conspiracist ideation; Performance on the cognitive reflection test
After: Trust in science, experts and authorities; Conspiracist ideation
Field: Secondary Outcomes (Explanation)
Before: How much do you trust each of the following to give you correct health-related news and information? Scientists, doctors and other health experts (1) The Indian Government (2) TV News Channels (3) Pharmaceutical and biotechnology companies such as Serum Institute of India, Astra Zeneca, Pfizer or Moderna (4) Social Media / Tech giants like Facebook (5) Fact-checker organisations that verify the accuracy of online viral posts (6) Response options: A trust them a lot (2) I somewhat trust them (1) I don't trust them (0)
After: How much do you trust each of the following to give you correct health-related news and information? Scientists, doctors and other health experts (1) The Indian Government (2) TV News Channels (3) Pharmaceutical and biotechnology companies such as Serum Institute of India, Astra Zeneca, Pfizer or Moderna (4) Social Media / Tech giants like Facebook (5) Fact-checker organisations that verify the accuracy of online viral posts (6) Response options: I trust them a lot (2) I somewhat trust them (1) I don't trust them (0) Conspiracist ideation: There is often debate about whether or not the public is told the whole truth about various important issues. The following questions are designed to assess your beliefs about some of these subjects. Please indicate the degree to which you believe each statement is likely to be true. (5-point scale) "The spread of certain viruses and/or diseases is the result of the deliberate, secret efforts of some organisation." "Mind-controlling technology is used manipulatively on people without their knowledge." "A lot of important information about diseases and treatments is deliberately kept secret from the public." "Some viruses and/or diseases which many people are infected with were created in a lab as bio-weapons." Response options: It's definitely true (4) It's likely to be true (3) Not sure (2) It's likely to be false (1) It's definitely false (0)