Statistical Literacy in the Newsroom: Experimental Evidence

Last registered on July 23, 2024

Pre-Trial

Trial Information

General Information

Title
Statistical Literacy in the Newsroom: Experimental Evidence
RCT ID
AEARCTR-0014037
Initial registration date
July 18, 2024

First published
July 23, 2024, 12:33 PM EDT

Locations

Region

Primary Investigator

Affiliation
ifo Institute for Economic Research

Other Primary Investigator(s)

PI Affiliation
ifo Institute for Economic Research
PI Affiliation
University of Cologne

Additional Trial Information

Status
In development
Start date
2024-07-23
End date
2024-08-31
Secondary IDs
Prior work
This trial does not extend or rely on any prior RCTs.
Abstract
Quality journalism can make scientific publications and statistics accessible for the general public. Yet, numerous studies reveal that many journalists lack proficiency in statistical analysis. To improve the statistical literacy of journalists, we have developed a concise educational video focusing on the accurate interpretation of scientific studies and statistics. The video addresses key topics such as (random) sampling, computation and interpretation of probabilities, differences between correlation and causation, and identification of misleading figures.
To assess the efficacy of the video, we recruit trained journalists through a mailing list of the German Journalistic Association. These journalists are randomly assigned to either a treatment or a control group (between-subject design). The treatment group receives the video immediately; the control group receives it only after the experiment. All participants are asked to assess the accuracy of several news articles covering scientific studies. Comparing means between the treatment and control groups yields an estimate of the video's efficacy.
External Link(s)

Registration Citation

Citation
Berger, Lara Marie, Anna Kerkhof and Nikola Noske. 2024. "Statistical Literacy in the Newsroom: Experimental Evidence." AEA RCT Registry. July 23. https://doi.org/10.1257/rct.14037-1.0
Experimental Details

Interventions

Intervention(s)
Educational video (see attachment)
Intervention (Hidden)
Intervention Start Date
2024-07-23
Intervention End Date
2024-08-31

Primary Outcomes

Primary Outcomes (end points)
1) Presence of reporting errors in the articles on scientific studies
2) Absolute number of errors per article
3) Type of errors: i) correlation vs. causation, ii) lack of context, iii) misinterpretation of statistics, iv) sample not representative /selection, v) other
Primary Outcomes (explanation)
1) Presence of reporting errors per article: binary variable. We compute the mean number of correct assessments per journalist.
2) Absolute number of errors per article: count variable. We compute the absolute deviation from the correct number of mistakes and then the mean deviation per participant.
3) Type of errors in each article: multiple choice. We compute the mean number of correctly identified types of mistakes per participant (see the scoring sketch below).
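The following is a minimal sketch, in Python, of how these three scores could be computed against the authors' pre-assessment. It is not the registered analysis code, and all file and column names (responses.csv, participant_id, says_error, true_has_error, etc.) are hypothetical.

import pandas as pd

# Hypothetical long-format data: one row per participant x article.
df = pd.read_csv("responses.csv")

# 1) Presence of reporting errors: correct binary assessment of each article.
df["correct_presence"] = (df["says_error"] == df["true_has_error"]).astype(int)

# 2) Absolute number of errors: deviation from the pre-assessed error count.
df["count_deviation"] = (df["n_errors_reported"] - df["true_n_errors"]).abs()

# 3) Type of errors: share of pre-assessed error types the participant identified
#    (one plausible scoring rule; the codebook may define this differently).
def share_types_identified(row):
    true_types = set(str(row["true_types"]).split(";"))
    reported_types = set(str(row["types_reported"]).split(";"))
    return len(true_types & reported_types) / max(len(true_types), 1)

df["types_correct"] = df.apply(share_types_identified, axis=1)

# Mean per journalist, as described above.
per_journalist = df.groupby("participant_id")[
    ["correct_presence", "count_deviation", "types_correct"]
].mean()
print(per_journalist.head())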

Secondary Outcomes

Secondary Outcomes (end points)
Individual title
Secondary Outcomes (explanation)
We ask participants to suggest a hypothetical title for an article that they would write about the scientific studies that we show them. We use NLP techniques (e.g., a dictionary-based approach) to assess how lurid the suggested titles are (see the sketch below).
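As an illustration only, a dictionary-based luridness score could look like the following sketch; the word list is purely hypothetical and is not the dictionary that will be used in the analysis.

import re

# Hypothetical dictionary of "lurid" terms (illustrative only).
LURID_WORDS = {"shocking", "sensational", "unbelievable", "miracle", "scandal", "secret"}

def luridness_score(title: str) -> float:
    """Share of tokens in the title that appear in the lurid-word dictionary."""
    tokens = re.findall(r"\w+", title.lower())
    if not tokens:
        return 0.0
    return sum(token in LURID_WORDS for token in tokens) / len(tokens)

print(luridness_score("Shocking study reveals miracle cure"))  # 0.4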

Experimental Design

Experimental Design
Survey flow:
We invite around 300 journalists via a mailing list of the German Journalistic Association and ask whether they are interested in participating in our study. Those who are interested are randomly assigned to either the treatment group or the control group (see the randomization sketch after the survey flow). Journalists in the treatment group see our educational video (see attachment). Journalists in the control group will see the video after the trial.
Next, we ask all participants to read two news articles covering scientific studies. We also give them the option to read the original scientific studies via a hyperlink. For each participant, we randomly draw two studies (plus the two corresponding news articles per study, i.e., four news articles in total) from a pool of five studies. We display only the most relevant parts of the news articles (e.g., title, teaser, beginning), but the entire news articles are available through drop-down menus. For all news articles, the authors have pre-assessed the type and number of reporting mistakes (if any), following a codebook (see attachment). The pre-assessment is pre-registered here (see attachment), but will not be visible to our participants.
Then, for each news article, we ask our participants to assess whether or not there are mistakes in the reporting (e.g., misinterpretation of statistics), how many mistakes there are, and which types of mistakes occur (see "Primary Outcomes"). We then ask them to suggest a hypothetical alternative title for an article covering the respective study (see "Secondary Outcomes").
After the trial, journalists in the control group receive access to the educational video, too.
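A minimal sketch, assuming hypothetical identifiers, of the two computerized random draws described above: balanced assignment of journalists to treatment or control, and the draw of two of the five studies for each participant.

import random

random.seed(2024)  # illustrative seed for reproducibility

participants = [f"journalist_{i}" for i in range(1, 301)]   # planned n = 300
studies = ["study_1", "study_2", "study_3", "study_4", "study_5"]

# Balanced assignment: 150 journalists per group, as planned.
arms = ["treatment"] * 150 + ["control"] * 150
random.shuffle(arms)

assignment = {}
for pid, arm in zip(participants, arms):
    drawn_studies = random.sample(studies, k=2)  # two studies (four news articles) per participant
    assignment[pid] = {"arm": arm, "studies": drawn_studies}

print(assignment["journalist_1"])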

Analysis:
In our main analysis, we compare the treatment group to the control group using simple OLS regressions, with each of our outcomes in turn on the left-hand side and a dummy for assignment to the treatment group on the right-hand side. The coefficient on the treatment indicator is our parameter of interest. Control variables include gender, age, and political orientation (see the regression sketch below).
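A minimal sketch of this specification in Python using statsmodels; it is not the registered analysis code, and the variable and file names are hypothetical.

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical analysis sample: one row per journalist with outcomes, treatment dummy, and controls.
data = pd.read_csv("analysis_sample.csv")

outcomes = ["correct_presence", "count_deviation", "types_correct"]
for y in outcomes:
    model = smf.ols(f"{y} ~ treatment + gender + age + political_orientation", data=data).fit()
    # The coefficient on 'treatment' is the parameter of interest.
    print(y, model.params["treatment"], model.bse["treatment"])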
Experimental Design Details
Randomization Method
Computerized randomization
Randomization Unit
individual journalists
Was the treatment clustered?
No

Experiment Characteristics

Sample size: planned number of clusters
300 journalists
Note: If we do not reach the planned number of observations, we will close the survey two weeks after sending out the invitations.
Sample size: planned number of observations
300 journalists
Sample size (or number of clusters) by treatment arms
150 journalists per treatment group
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
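For illustration only (this calculation is not part of the registration), the minimum detectable effect implied by 150 journalists per arm could be approximated as follows, assuming a two-sided two-sample t-test at the 5% significance level with 80% power.

from statsmodels.stats.power import TTestIndPower

# Solve for the effect size (in standard deviations) detectable with 150 per arm.
mde = TTestIndPower().solve_power(nobs1=150, alpha=0.05, power=0.80, ratio=1.0)
print(f"MDE under these assumptions: {mde:.2f} SD")  # roughly 0.32 SD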
Supporting Documents and Materials

There is information in this trial unavailable to the public.
IRB

Institutional Review Boards (IRBs)

IRB Name
Ethical Review Board of the Faculty of Management, Economics, and Social Sciences
IRB Approval Date
2024-02-05
IRB Approval Number
240007LB

Post-Trial

Post Trial Information

Study Withdrawal

There is information in this trial unavailable to the public.

Intervention

Is the intervention completed?
No
Data Collection Complete
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials