Can journalists be empowered through training and resources to counter misinformation?

Last registered on July 21, 2022

Pre-Trial

Trial Information

General Information

Title
Can journalists be empowered through training and resources to counter misinformation?
RCT ID
AEARCTR-0009749
Initial registration date
July 14, 2022

First published
July 21, 2022, 11:26 AM EDT

Locations

Primary Investigator

Affiliation
ITAM

Other Primary Investigator(s)

PI Affiliation
ITAM
PI Affiliation
Columbia University
PI Affiliation
ITAM
PI Affiliation
ITAM

Additional Trial Information

Status
Completed
Start date
2020-05-01
End date
2021-04-15
Secondary IDs
Prior work
This trial does not extend or rely on any prior RCTs.
Abstract
In the midst of the COVID-19 pandemic and increasingly polarized politics, misinformation, including deliberately deceptive disinformation, has become widespread. This has been especially true of the Global South, where journalistic standards and fact-checking institutions are often comparatively less developed. The rise of misinformation has largely been fueled by social media, which has become a central source of information for many citizens in such contexts. Bolivia, the context of this study, has been particularly subject to increasing misinformation, fueled by its political turmoil and the COVID-19 pandemic. In principle, journalists, the key producers of credible content, could play a central role in countering the dissemination of misinformation. However, journalists in low- and middle-income countries often lack training to detect and fact-check likely misinformation, the time to extensively corroborate source content, or the capacity or incentive to produce appealing articles to debunk it or compete against it. Given shrinking newsrooms, journalists have often amplified the dissemination of misinformation around COVID-19 and politics by reproducing viral news without fact-checking it, whether wittingly or unwittingly.

To evaluate the extent to which providing journalists with training on misinformation and resources can overcome these challenges, Internews conducted a randomized intervention among journalists in Bolivia. Out of a pool of approximately 350 applicants, 145 journalists were screened as eligible to participate in the program. Of those, 73 "treated" journalists were randomly selected to receive an invitation to participate in the program, while 72 "control" journalists were not invited. The intervention provided treated local journalists with: (i) training to identify misinformation and engage in fact-checking, so that they could produce stories that combat and outcompete misinformation; (ii) seed funding to produce original investigative journalistic content relating to misinformation; (iii) information about trending likely misinformation, as well as fact-checks recently conducted by local fact-checkers, to help journalists identify relevant topics for their regular work; and (iv) online materials advising on how to communicate fact-checks. Ultimately, the goal was to assess whether this bundle of interventions could reduce the production and social media sharing of misinformation and increase the production and sharing of content that corrects misinformation.

To evaluate the effectiveness of Internews' intervention, we will consider a series of outcomes. First, we will examine differences between treated and control journalists in knowledge and behavior regarding how to identify misinformation, engage in fact-checking, and produce and share content to outcompete misinformation. Moreover, we will assess differences using outcomes that measure whether journalists report producing and sharing journalistic content to outcompete misinformation. To that end, we conducted a survey between March and April 2021, roughly a year after the training concluded. Second, we will look at differences in the characteristics and popularity of the content actually produced and shared by treated and control journalists, which complement the self-reported outcomes from the survey. To that end, we have scraped all publicly available content produced and shared online and on social media by treated and control journalists. We will use machine learning techniques to assess the extent to which the content produced and shared by treated journalists is of high quality in terms of informational content and style, whether it resembles misinformation, and whether it seeks to combat misinformation. Moreover, we will analyze differences in the popularity of content produced and shared by treated and control journalists on social media, based on content shares and reactions on Facebook and Twitter. Measuring such popularity is important because content that tries to debunk misinformation may not be as popular as the information it seeks to debunk. Third, we will assess citizen reactions to the content produced by journalists who were and were not part of the program. To that end, we conducted a survey of Bolivian citizens between December 2020 and March 2021, which included a second randomized evaluation that showed respondents titles and articles produced by treated and control journalists on the topics of health and politics and elicited their perceptions of those titles and articles.
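For concreteness, below is a minimal sketch of the kind of machine-learning scoring described above: a simple supervised text classifier trained on labeled examples and then used to score scraped content for resemblance to misinformation. The training texts, labels, and model choice here are hypothetical placeholders; the actual measures and models are specified in the attached analysis plan.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical hand-labeled training examples (placeholders, not study data):
# 1 = resembles misinformation, 0 = corrective / fact-checking content
train_texts = [
    "viral claim circulating on social media without any sources",
    "fact-check reviewing the evidence behind a viral claim",
]
train_labels = [1, 0]

# Simple bag-of-words classifier; the registered analysis may rely on richer models
model = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
model.fit(train_texts, train_labels)

# Score scraped content from treated and control journalists
scraped_texts = ["example scraped article text"]
misinformation_scores = model.predict_proba(scraped_texts)[:, 1]
```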
External Link(s)

Registration Citation

Citation
Bandiera, Antonella et al. 2022. "Can journalists be empowered through training and resources to counter misinformation?" AEA RCT Registry. July 21. https://doi.org/10.1257/rct.9749-1.0
Experimental Details

Interventions

Intervention(s)
Internews' journalist misinformation training program consisted of three core elements: three training workshops and funding to write an article about misinformation; the dissemination of reports about trending misinformation; and videos and infographics advising journalists on how to write articles and fact-checks in a compelling and accurate way. We next describe each element of this bundled treatment.

Misinformation workshops and seed funding. Participating journalists first took part in three workshops on misinformation over the course of three weeks. The workshops were led by Internews and Maldita.es, a recognized fact-checker in Spain, over Zoom (due to COVID-19 precautions). To maximize attendance and the interaction between the workshop instructor and participants, journalists were split into two different groups: the first group attended workshops on Saturday mornings, and the second on Thursday nights. Each workshop lasted two hours, and thus each treated journalist could receive up to 6 hours of training on misinformation.

Each workshop focused on different issues relating to misinformation. The first workshop covered basic concepts relating to misinformation, how it disseminates over social media, and how different organizations work to fight against it. To reinforce learning, attendees had to identify examples of misinformation-related content as part of their first homework assignment. The second workshop briefly reviewed the content imparted during the first workshop before turning to methodologies and tools to fact-check potential misinformation, including advanced text and reverse-image searching, photo forensics, metadata checking, and geo-localization checking with tools such as Google Maps, among other techniques. These techniques were then reinforced through the second homework assignment. The final workshop included several applications of the second workshop's tools and an overview of best practices for producing and publishing fact-checks. As part of their last assignment, journalists were asked to employ these newly learned practices and guidelines to produce and publish content around misinformation. Subject to project proposal approval, treated journalists were provided with approximately 200 USD in seed funding to facilitate writing this article.
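As an illustration of one of the reverse-image techniques covered in the second workshop, the sketch below flags whether a suspect image is a near-duplicate of known originals using perceptual hashing. The file paths and distance threshold are hypothetical, and workshop participants used interactive tools rather than code.

```python
import imagehash          # pip install imagehash pillow
from PIL import Image

def likely_reused(suspect_path, known_paths, max_distance=8):
    """Check whether a suspect image closely matches any known original.

    Compares perceptual hashes; a small Hamming distance suggests the image
    is a (possibly re-cropped or re-compressed) copy of an earlier photo.
    Paths and the distance threshold are illustrative.
    """
    suspect_hash = imagehash.phash(Image.open(suspect_path))
    for path in known_paths:
        if suspect_hash - imagehash.phash(Image.open(path)) <= max_distance:
            return True, path
    return False, None
```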

Though various reminders and messages were sent to the 73 journalists who were invited to participate in the training, 63 (86%) ultimately attended at least one workshop, and the average journalist attended 2.5 of the 3 workshops. Of these journalists, 53 (72% of all invited journalists) completed and submitted the three post-training assignments previously mentioned. Since completion of the assignments was a key criterion for eligibility for Internews' fellowship support fund, only this set of trainees was eligible for seed funding. However, several trainees were not interested in the seed fund. This resulted in 49 proposals being submitted to conduct projects as part of the program. After review of the proposals, three required a second revision. In the end, a total of 46 proposals were approved, and 44 were executed and delivered by the journalists.

Virality and misinformation reports. To increase the effectiveness of the training by providing source material to motivate potential topics for journalists to address, a daily distribution of virality reports was implemented between the end of September and mid-December. These reports informed treated journalists about the most viral news of the day, defined as the articles with the greatest social engagement on Facebook according to CrowdTangle. Journalists also received weekly misinformation reports covering the most popular fact-checks of the week. Specifically, the weekly fact-checks from the two main Bolivian fact-checking organizations, Chequea Bolivia and Bolivia Verifica, were examined, and CrowdTangle was again used to identify the ten most viral fact-checks on Facebook.
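For reference, the sketch below shows how a daily top-ten ranking of this kind could be assembled from an engagement export (for example, a CrowdTangle CSV); the column names are hypothetical, and the program's actual reports may have been produced differently.

```python
import pandas as pd

def top_viral_articles(csv_path, n=10):
    """Rank articles by total Facebook engagement from an exported CSV.

    Assumes hypothetical columns: 'url', 'title', 'likes', 'comments', 'shares'.
    Returns the n most-engaged articles, analogous to a daily virality report.
    """
    posts = pd.read_csv(csv_path)
    posts["engagement"] = posts[["likes", "comments", "shares"]].sum(axis=1)
    # Aggregate posts linking to the same article, then rank by total engagement
    ranked = (posts.groupby(["url", "title"], as_index=False)["engagement"]
                   .sum()
                   .sort_values("engagement", ascending=False))
    return ranked.head(n)
```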

Infographics and video on how to write a fact-check. Alongside the virality and misinformation reports, two other types of products were produced and distributed as part of the program in November 2021: four videos and ten infographics. The objective was to reinforce the content conveyed during the training workshops. The videos were particularly targeted toward journalists and focused on how to write and communicate fact-checks to successfully counter misinformation. The infographics were targeted at a broader audience and covered a range of topics related to misinformation.
Intervention Start Date
2020-06-01
Intervention End Date
2020-07-20

Primary Outcomes

Primary Outcomes (end points)
Measures of knowledge of misinformation, including 1) ability to detect misinformation, 2) knowledge of verification techniques and existing fact-checking initiatives, and 3) knowledge of how to write a fact-check and disseminate it.
Quality and metrics of social media posts.
Quality of journal articles.
Perceived quality of social media articles.
[See attached document for more details.]
Primary Outcomes (explanation)
The outcomes will be constructed using inverse covariance weighted indexes where applicable. See attached document for more details.
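As a reference for how such an index is commonly built (in the spirit of Anderson 2008), the sketch below standardizes each component outcome against the control group and weights components by the row sums of the inverse covariance matrix. This is only an illustrative sketch; the exact components, sign conventions, and handling of missing data are specified in the attached document.

```python
import numpy as np

def icw_index(outcomes, control_mask):
    """Inverse-covariance-weighted summary index.

    outcomes: (n_units, n_outcomes) array of component outcome measures
    control_mask: boolean array marking control-group units
    Returns one summary index value per unit.
    """
    # Standardize each component using the control-group mean and SD
    mu = outcomes[control_mask].mean(axis=0)
    sd = outcomes[control_mask].std(axis=0, ddof=1)
    z = (outcomes - mu) / sd

    # Weight components by the row sums of the inverse covariance matrix,
    # so highly correlated components contribute less to the index
    weights = np.linalg.inv(np.cov(z, rowvar=False)).sum(axis=1)
    return z @ weights / weights.sum()
```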

Secondary Outcomes

Secondary Outcomes (end points)
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
Out of a pool of approximately 350 applicants, 145 journalists were screened as eligible to participate in the program. Of those, 73 "treated" journalists were randomly selected to receive an invitation to participate in the program, while 72 "control" journalists were not invited. The intervention provided treated local journalists with (i) training to identify misinformation and engage in fact-checking, so that they could produce stories that combat and outcompete misinformation; (ii) seed funding to produce original investigative journalistic content relating to misinformation; (iii) information about trending likely misinformation, as well as fact-checks recently conducted by local fact-checkers, to help journalists identify relevant topics for their regular work; and (iv) online materials advising on how to communicate fact-checks.
Three surveys were conducted: a baseline survey, a midline survey after the end of training, and an endline survey one year after the end of training. In addition to the surveys, social media posts and journal articles were scraped daily until the endline was completed.
Experimental Design Details
Randomization Method
Randomization was blocked on several predetermined covariates. Treatment assignments at the block level were randomly generated by computer using the blockTools package in R.
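For illustration only, a minimal Python sketch of within-block random assignment is given below; the study's actual assignment was generated with the blockTools package in R, and the block identifiers and data layout here are hypothetical.

```python
import numpy as np
import pandas as pd

def assign_within_blocks(units, block_col="block_id", seed=0):
    """Randomly assign treatment within pre-formed blocks.

    units: one row per randomization unit (journalist or media outlet)
    block_col: hypothetical column identifying each unit's block
    Returns the data frame with a 0/1 'treated' column added.
    """
    rng = np.random.default_rng(seed)
    assignments = []
    for _, block in units.groupby(block_col):
        n = len(block)
        # Treat half the units in each block; an odd unit is assigned at random
        n_treat = n // 2 + int(rng.integers(0, 2)) * (n % 2)
        treated = np.zeros(n, dtype=int)
        treated[rng.choice(n, size=n_treat, replace=False)] = 1
        assignments.append(pd.Series(treated, index=block.index))
    return units.assign(treated=pd.concat(assignments))
```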
Randomization Unit
The unit of randomization is (a) the journalist, for participants who do not share an affiliation with other journalists in the sample (127 journalists), or (b) the media outlet, for participants who have colleagues from the same outlet in the sample (18 journalists across eight outlets). In case (a), each journalist is individually assigned to treatment or control; in case (b), all participants who work at a given outlet are assigned together to either treatment or control.
Was the treatment clustered?
Yes

Experiment Characteristics

Sample size: planned number of clusters
We assign treatment at the level of the journalist (not clustered) for 127 journalists, and at the level of the media outlet for 18 journalists (8 clusters/media outlets).
Sample size: planned number of observations
145
Sample size (or number of clusters) by treatment arms
73 treated journalists, 72 control journalists
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
IRB

Institutional Review Boards (IRBs)

IRB Name
Columbia Research - Human Research Protection Office Institutional Review Boards
IRB Approval Date
2021-12-01
IRB Approval Number
IRB-AAAT5193
Analysis Plan

Analysis Plan Documents

Pre-Analysis Plan. Can journalists be empowered through training and resources to counter misinformation?

MD5: e2000d3267e5e8a263ef3e5980e87b4f

SHA1: 7d5b0ebd83da99f46dd19ae3d2a41b735b975eea

Uploaded At: July 14, 2022

Post-Trial

Post Trial Information

Study Withdrawal

There is information in this trial that is unavailable to the public.

Intervention

Is the intervention completed?
No
Data Collection Complete
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials