Unveiling Bias in Judicial Decisions: How Law School Shapes Perspectives

Last registered on April 17, 2025

Pre-Trial

Trial Information

General Information

Title
Unveiling Bias in Judicial Decisions: How Law School Shapes Perspectives
RCT ID
AEARCTR-0015465
Initial registration date
April 07, 2025


First published
April 17, 2025, 6:20 AM EDT


Locations

Region

Primary Investigator

Affiliation

Other Primary Investigator(s)

PI Affiliation

Additional Trial Information

Status
In development
Start date
2025-04-08
End date
2025-08-01
Secondary IDs
Prior work
This trial does not extend or rely on any prior RCTs.
Abstract
The objective of this study is to better understand the origins of discrimination in the judicial system. While a large body of literature documents the existence of biases in judicial decision-making, we do not know where these biases originate. To gain a better understanding of the role of legal training, we conduct a survey experiment with German law students. We aim to answer two main questions: First, do law students evaluate criminal cases differently if we vary certain characteristics of the perpetrator? Second, how does the answering behavior change over the course of law school?
External Link(s)

Registration Citation

Citation
Goldemann, Lennart and Maike Schlosser. 2025. "Unveiling Bias in Judicial Decisions: How Law School Shapes Perspectives." AEA RCT Registry. April 17. https://doi.org/10.1257/rct.15465-1.0
Experimental Details

Interventions

Intervention(s)
Intervention (Hidden)
Intervention Start Date
2025-04-08
Intervention End Date
2025-08-01

Primary Outcomes

Primary Outcomes (end points)
Case evaluations: Students decide whether or not they follow a stricter assessment of the case on a 4-point Likert scale (yes, rather yes, rather no, no). Additionally, we construct a binary variable indicating approval (= 1 if yes or rather yes).
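As an illustration only, a minimal Python sketch of how the binary approval indicator could be constructed from the 4-point item; the data frame and column names are hypothetical and not part of the registration.

import pandas as pd

# Hypothetical example data: one row per student-case evaluation.
df = pd.DataFrame({
    "student_id": [1, 1, 2, 2],
    "case_id": ["A", "B", "A", "B"],
    "evaluation": ["yes", "rather no", "rather yes", "no"],  # 4-point item
})

# Binary approval indicator: 1 if "yes" or "rather yes", 0 otherwise.
df["approval"] = df["evaluation"].isin(["yes", "rather yes"]).astype(int)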
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
Degree of penalty: Depending on their case evaluation, students determine the severity of the penalty (length of incarceration or monetary punishment). The length of incarceration is evaluated on a continuous scale. If the decision is between incarceration and a monetary punishment, we construct a dummy variable coded as 1 if the student chooses incarceration.

Degree of (un)certainty: After each case evaluation, we ask the students about their degree of certainty in assessing the respective case. For this, we use a slider on which students rate their certainty on a scale from 1 (extremely uncertain) to 5 (extremely certain).
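Purely as an illustration, a sketch of how the secondary outcome variables could be coded; the column names and the unit for incarceration length (months) are assumptions, not part of the registration.

import numpy as np
import pandas as pd

# Hypothetical responses: penalty type, incarceration length, and certainty slider.
df = pd.DataFrame({
    "penalty_type": ["incarceration", "monetary", "incarceration"],
    "incarceration_months": [24, np.nan, 6],   # continuous length, assumed unit
    "certainty": [4, 2, 5],                    # slider from 1 to 5
})

# Dummy coded 1 if the student chooses incarceration over a monetary punishment.
df["chose_incarceration"] = (df["penalty_type"] == "incarceration").astype(int)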
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
We use a survey experiment to examine whether certain characteristics of a perpetrator influence law students' judicial decision-making in criminal case evaluations. We collaborated with legal experts to design short criminal cases resembling those used in law school training. The survey is distributed in law school courses at universities and among law students through cooperation partners; additionally, we conduct it ourselves at selected universities. To encourage participation, students have the chance to win vouchers through a raffle.

Experimental Design Details
The core of the experiment consists of five criminal cases developed in collaboration with legal experts, each with at least two versions. In three of the cases, we vary the gender composition of perpetrator and victim (i.e., in one version the perpetrator is male and the victim is female; in the other version, the perpetrator is female and the victim is male). In the two other cases, we manipulate the perpetrator's perceived migration background by altering his name. One of the cases additionally includes a third version, resulting in a control version, a version in which we vary the gender of the perpetrator, and a version in which we vary the perceived migration background.

This design allows us to compare potential treatment effects across the two treatment dimensions. In all cases, we introduce the treatment variation through the name; apart from the name, all other aspects of the criminal cases remain unchanged. The cases are designed to be open to interpretation, allowing arguments for or against a stricter assessment of the offense, which gives students room to justify their decisions in either direction. We aim to investigate whether students' case evaluations are influenced by variation in the perpetrator's gender or perceived migration background.

Each participating law student is presented with one version of each case, solving a total of five cases. The version assigned to each student is randomized for every case.
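The registration states that the version a student sees is drawn independently for every case; the following sketch illustrates that assignment logic outside of LimeSurvey. The case labels and the mapping of version counts to specific cases are assumptions for illustration only.

import random

# Five cases; four have two versions, one case has three versions
# (control, gender variation, migration-background variation).
case_versions = {"case_1": 2, "case_2": 2, "case_3": 2, "case_4": 2, "case_5": 3}

def assign_versions(seed=None):
    """Draw one version per case for a single participant."""
    rng = random.Random(seed)
    return {case: rng.randrange(n) for case, n in case_versions.items()}

# Example: independent draws for two participants.
print(assign_versions(seed=1))
print(assign_versions(seed=2))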
Randomization Method
The survey experiment is conducted through the online survey provider LimeSurvey. Overall, we have five treatments (five cases). For each individual, the version of each case they receive is random. The randomization is implemented in LimeSurvey.
Randomization Unit
For each case, we randomly allocate the treatment or control version to an individual. The randomization unit is the individual-case.
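The registration does not spell out the estimating equation. Purely as an illustration of how individual-case randomization could be handled in estimation, a minimal sketch of a linear probability model with standard errors clustered at the student level (since each student evaluates all five cases); the data are simulated and all variable names are hypothetical.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated individual-case data for illustration only (not real responses).
rng = np.random.default_rng(0)
n_students, cases = 200, ["case_1", "case_2", "case_3", "case_4", "case_5"]
df = pd.DataFrame({
    "student_id": np.repeat(np.arange(n_students), len(cases)),
    "case_id": np.tile(cases, n_students),
    "treated": rng.integers(0, 2, n_students * len(cases)),
})
df["approval"] = rng.integers(0, 2, len(df))

# Linear probability model with case fixed effects; standard errors are
# clustered at the student level because each student answers five cases.
model = smf.ols("approval ~ treated + C(case_id)", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["student_id"]}
)
print(model.summary())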
Was the treatment clustered?
No

Experiment Characteristics

Sample size: planned number of clusters
We contact professors and staff at 40 universities. We do not know at this point how many of them will distribute our survey experiment among the law students at their respective universities.
Sample size: planned number of observations
We contact professors and staff at 40 universities. We do not know at this point how many of them will distribute our survey experiment among the law students at their respective universities.
Sample size (or number of clusters) by treatment arms
We randomize at the individual level. Hence, each version of each case should be answered by approximately 50% of the total sample.
The exception is the case with three versions, where we accordingly expect each version to be answered by roughly one third of the sample.
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
IRB

Institutional Review Boards (IRBs)

IRB Name
IRB Approval Date
IRB Approval Number
Analysis Plan

There is information in this trial unavailable to the public.

Post-Trial

Post Trial Information

Study Withdrawal

There is information in this trial unavailable to the public.

Intervention

Is the intervention completed?
No
Data Collection Complete
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials