AI Legal Summaries and Online Discourse

Last registered on July 01, 2024

Pre-Trial

Trial Information

General Information

Title
AI Legal Summaries and Online Discourse
RCT ID
AEARCTR-0013885
Initial registration date
June 26, 2024

Initial registration date is when the trial was registered.

It corresponds to when the registration was submitted to the Registry to be reviewed for publication.

First published
July 01, 2024, 12:07 PM EDT

First published corresponds to when the trial was first made public on the Registry after being reviewed.

Locations

Region

Primary Investigator

Affiliation
UIUC

Other Primary Investigator(s)

PI Affiliation
ETH Zurich
PI Affiliation
Fordham University
PI Affiliation
Columbia University
PI Affiliation
ETH Zurich

Additional Trial Information

Status
Ongoing
Start date
2024-06-26
End date
2025-09-30
Secondary IDs
Prior work
This trial does not extend or rely on any prior RCTs.
Abstract
Our project facilitates public understanding of law by making legal documents accessible to the public through a novel system built with artificial intelligence. In this study, we quantify the effect of public understanding of the law on online discourse through field experiments randomizing the availability of legal document summaries. We will randomly assign some social media posts to receive accessible summaries. We will then compare on-platform engagement in treated conversations that received the accessible summaries relative to control conversations.
External Link(s)

Registration Citation

Citation
Ash, Elliott et al. 2024. "AI Legal Summaries and Online Discourse." AEA RCT Registry. July 01. https://doi.org/10.1257/rct.13885-1.0
Sponsors & Partners

There is information in this trial unavailable to the public.

Experimental Details

Interventions

Intervention(s)
To study the effect of exposure on outcomes, we will randomly embed our legal summaries in the comment sections of relevant YouTube videos.
Intervention (Hidden)
For the Facts arm, we use the case facts taken directly from Oyez. For the Opinions arm, we generate a summary from (a) the facts from Oyez, (b) the full text of the majority opinion, and (c) the full text of the dissenting opinion. The opinion summaries are created in a two-step process using the GPT-4 API, similar to the pipeline described in our earlier work (Ash et al. 2024). Specifically, the process for creating the treatment text involves: 1) summarizing the case facts and the majority opinion, applying style transfer to highlight the most important points in the opinion and improve overall readability; and 2) combining this summary with the dissenting opinion, applying style transfer to summarize the dissent's main arguments in a final paragraph.
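
As a rough illustration only, the two-step process could be implemented along the following lines, assuming the OpenAI Python client; the prompts, helper names, and model string are illustrative rather than the exact ones used in the study.

# Hedged sketch of the two-step GPT-4 summarization with style transfer.
# Prompts and function names are hypothetical, not the study's actual prompts.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def summarize(prompt_text: str) -> str:
    """Send one summarization / style-transfer request to GPT-4."""
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": prompt_text}],
    )
    return response.choices[0].message.content

def build_opinion_summary(facts: str, majority: str, dissent: str) -> str:
    # Step 1: summarize the facts and the majority opinion, applying
    # style transfer for readability.
    step1 = summarize(
        "Summarize the following case facts and majority opinion in plain, "
        "accessible English, keeping only the most important points.\n\n"
        f"FACTS:\n{facts}\n\nMAJORITY OPINION:\n{majority}"
    )
    # Step 2: combine the step-1 summary with the dissent and add a final
    # paragraph summarizing the dissent's main arguments.
    step2 = summarize(
        "Below is a plain-English summary of a case and its majority opinion, "
        "followed by the dissenting opinion. Keep the summary and add one "
        "final paragraph summarizing the dissent's main arguments in the "
        "same accessible style.\n\n"
        f"SUMMARY:\n{step1}\n\nDISSENTING OPINION:\n{dissent}"
    )
    return step2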
Intervention Start Date
2024-06-26
Intervention End Date
2025-09-30

Primary Outcomes

Primary Outcomes (end points)
The primary outcomes are quantity of engagement, quality of engagement, and the prevalence and accuracy of legal reasoning in the discussions.
Primary Outcomes (explanation)
To measure the quantity of engagement, we will use outcomes provided directly by the YouTube API or derivable with minimal postprocessing of the API output (e.g., average sentence length). To measure the quality of engagement, we will examine several measures based on theories of deliberative democracy. To measure the use of legal arguments, we will look at the use of legal jargon and the similarity between comments and our treatment text.
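
As an illustrative sketch only (not the registered measurement code), comment volume could be pulled from the YouTube Data API and comment-to-summary similarity computed with TF-IDF cosine similarity; the API key, video ID, and helper names below are placeholders, and the deliberative-quality measures are not sketched here.

# Hedged sketch of two outcome measures: comment counts from the YouTube
# Data API, and TF-IDF cosine similarity between comments and the summary.
from googleapiclient.discovery import build
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

youtube = build("youtube", "v3", developerKey="YOUR_API_KEY")  # placeholder key

def fetch_comments(video_id: str) -> list[str]:
    """Return top-level comment texts for one video (first page only)."""
    response = youtube.commentThreads().list(
        part="snippet", videoId=video_id, maxResults=100, textFormat="plainText"
    ).execute()
    return [
        item["snippet"]["topLevelComment"]["snippet"]["textDisplay"]
        for item in response.get("items", [])
    ]

def similarity_to_summary(comments: list[str], summary_text: str) -> list[float]:
    """Cosine similarity of each comment to the treatment summary text."""
    vectorizer = TfidfVectorizer(stop_words="english")
    matrix = vectorizer.fit_transform([summary_text] + comments)
    return cosine_similarity(matrix[0:1], matrix[1:]).flatten().tolist()

comments = fetch_comments("VIDEO_ID")   # placeholder video ID
quantity = len(comments)                # one simple quantity-of-engagement measure
similarities = similarity_to_summary(comments, "...treatment summary text...")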

Secondary Outcomes

Secondary Outcomes (end points)
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
We will use social media bots on YouTube to track discussions of SCOTUS cases for potential injection of text into the conversation. We will select high-profile cases in Summer 2024 that are expected to be discussed on social media. We will randomly reply to some videos with accessible summaries.
Experimental Design Details
Using a field experiment on YouTube, we study how exposure to simple and accessible summaries of facts and judicial opinions affects online discourse about Supreme Court cases.

We will use social media bots on YouTube to track discussions of SCOTUS cases for potential injection of text into the conversation. We will select high-profile cases in Summer 2024 that are expected to be discussed on social media. We will regularly scrape videos that discuss these cases and randomize each video into one of three groups: 1) Facts, where we post a comment with the facts of the case from the judicial archive Oyez; 2) Opinions, where we post an accessible plain-English summary of the case facts, the majority opinion, and, if present, the dissenting opinions; or 3) Control, where we only track the video without posting any comment.
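
A minimal sketch of this computer randomization into the three arms, with equal assignment probability, might look as follows; the seed, record format, and video IDs are placeholders rather than part of the registered protocol.

# Hedged sketch of randomizing scraped videos into the three arms.
import random

ARMS = ["Facts", "Opinions", "Control"]
rng = random.Random(20240626)  # fixed seed for reproducibility (illustrative)

def assign_arm(video_id: str, assignments: dict[str, str]) -> str:
    """Assign each newly scraped video to one arm with equal probability."""
    if video_id not in assignments:
        assignments[video_id] = rng.choice(ARMS)
    return assignments[video_id]

assignments: dict[str, str] = {}
for vid in ["abc123", "def456", "ghi789"]:   # placeholder scraped video IDs
    arm = assign_arm(vid, assignments)
    # Facts / Opinions videos receive the corresponding comment;
    # Control videos are only tracked.
    print(vid, arm)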

We will look at the effect of this exposure on the behavior of other users on YouTube, such as engagement on related videos.
Randomization Method
randomization done by a computer
Randomization Unit
YouTube video
Was the treatment clustered?
No

Experiment Characteristics

Sample size: planned number of clusters
Depends on the number of videos posted on the topic; up to 10,000 videos
Sample size: planned number of observations
Depends on the number of videos posted on the topic; up to 10,000 videos
Sample size (or number of clusters) by treatment arms
One-third to each of the Opinions, Facts, and Control arms
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
IRB

Institutional Review Boards (IRBs)

IRB Name
Columbia University
IRB Approval Date
2023-06-22
IRB Approval Number
IRB-AAAU7423
Analysis Plan

There is information in this trial unavailable to the public.


Post-Trial

Post Trial Information

Study Withdrawal

There is information in this trial unavailable to the public.


Intervention

Is the intervention completed?
No
Data Collection Complete
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials