Randomized Evaluation in Legal Assistance: What Difference Does Representation (Offer and Actual Use) Make?

Last registered on September 06, 2017

Pre-Trial

Trial Information

General Information

Title
Randomized Evaluation in Legal Assistance: What Difference Does Representation (Offer and Actual Use) Make?
RCT ID
AEARCTR-0001677
Initial registration date
August 31, 2017

Initial registration date is when the trial was registered.

It corresponds to when the registration was submitted to the Registry to be reviewed for publication.

First published
September 06, 2017, 9:54 AM EDT

First published corresponds to when the trial was first made public on the Registry after being reviewed.

Locations

Primary Investigator

Affiliation
Harvard Law School

Other Primary Investigator(s)

PI Affiliation
Wellesley College

Additional Trial Information

Status
Completed
Start date
2008-06-15
End date
2011-07-29
Secondary IDs
Abstract
We report the results of the first of a series of randomized evaluations of legal assistance programs. This series of evaluations is designed to measure the effect of both an offer of and the actual use of representation, although it was not possible in the first study we report here to measure constructively all effects of actual use. The results of this first evaluation are unexpected, and we caution against both overgeneralization and undergeneralization. Specifically, the offers of representation came from a law school clinic, which provided high-quality and well-respected assistance in administrative "appeals" to state administrative law judges (ALJs) of initial rulings regarding eligibility for unemployment benefits. These "appeals" were actually de novo mini-trials. Our randomized evaluation found that the offers of representation from the clinic had no statistically significant effect on the probability that unemployment claimants would prevail in their "appeals", but that the offers did delay proceedings by, on average, about two weeks. Actual use of representation (from any source) also delayed the proceeding; we could come to no firm conclusions regarding the effect of actual use of representation (from any source) on the probability that claimants would prevail. Keeping in mind the high-quality and well-respected nature of the representation the law school clinic offered and provided, we explore three possible explanations for our results, each of which has implications for delivery of legal services. We also conduct a review of previous quantitative research attempting to measure representation effects. We find that, excepting the results of two randomized studies separated by more than thirty years, this literature provides virtually no credible quantitative information on the effect of an offer of or actual use of legal representation. Finally, we discuss disadvantages, advantages, and future prospects of randomized studies in the provision of legal assistance.
External Link(s)

Registration Citation

Citation
Greiner, James and Cassandra Wolos Pattanayak. 2017. "Randomized Evaluation in Legal Assistance: What Difference Does Representation (Offer and Actual Use) Make?." AEA RCT Registry. September 06. https://doi.org/10.1257/rct.1677-1.0
Former Citation
Greiner, James and Cassandra Wolos Pattanayak. 2017. "Randomized Evaluation in Legal Assistance: What Difference Does Representation (Offer and Actual Use) Make?." AEA RCT Registry. September 06. https://www.socialscienceregistry.org/trials/1677/history/21209
Sponsors & Partners

There is information in this trial unavailable to the public.
Experimental Details

Interventions

Intervention(s)
The researchers conducted a randomized field experiment to examine the effect of an offer of legal representation on claimant outcomes in unemployment benefit hearings. Claimants who contacted a student-attorney organization by telephone were randomly assigned either to receive an offer of representation from a student-attorney or to be given a list of other legal service providers. During the initial screening, claimants were informed about the experiment, and an oral consent form was completed if the claimant agreed to participate. For claimants who received and accepted an offer of representation, a student-attorney collected relevant documents, researched the case, prepared the claimant for the hearing, and represented the claimant at the hearing. Claimants randomized to receive no such offer were given information about other possible legal service providers. The researchers performed the randomization and later gathered information on hearing results from the Massachusetts Division of Unemployment Assistance.
Intervention Start Date
2008-06-15
Intervention End Date
2010-05-15

Primary Outcomes

Primary Outcomes (end points)
Claimant outcome (win or lose the appeal);
Log number of days between "initial ruling" and the appeal result
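The second outcome is a simple transformation of elapsed time. Below is a minimal sketch, under assumptions, of how it could be constructed from two dates; the function name and example dates are hypothetical and are not drawn from the study's data.

```python
import math
from datetime import date


def log_days_to_result(initial_ruling: date, appeal_result: date) -> float:
    """Log of the number of days between the initial ruling and the appeal result."""
    days = (appeal_result - initial_ruling).days
    return math.log(days)


# Hypothetical example: an initial ruling whose appeal is resolved 35 days later.
print(log_days_to_result(date(2009, 1, 5), date(2009, 2, 9)))  # log(35) ≈ 3.56
```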
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
After an initial determination on a claimant's eligibility for unemployment benefits is made and the first-level appeal stage is reached, the claimant is eligible (subject to certain requirements) to reach out to the Harvard Legal Aid Bureau (HLAB) to seek representation. Due to the time-sensitive nature of the unemployment insurance system, HLAB had to decide within twenty-four hours whether to represent a claimant. After HLAB informed the claimant about the experiment and obtained consent, oral consent verification and additional information were sent to the PIs, who randomized each case as it was received using assignment probabilities ranging from .75 to .15 depending on the time of year. Claimants randomized into the treatment group who accepted the offer of representation had their cases prepared and presented by a student-attorney. Claimants not selected into the treatment group were told about other legal resources they could seek. The PIs later sent confirmation of consent and a form requesting information on the ruling and relevant dates in each claimant's case.
Experimental Design Details
Randomization Method
Due to the time-sensitive nature of the unemployment insurance system, the incremental intake of cases, and the seasonal nature of student-attorney availability, each case was randomized as it was received. Assignment probabilities were adjusted approximately every eight weeks and ranged from .75 down to .15, with the lowest probabilities applying during the summer when fewer student-attorneys were available.
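The sketch below illustrates, in Python, how case-by-case Bernoulli randomization with period-specific assignment probabilities could be implemented. It is not the PIs' actual procedure; the period dates, the .15/.75 schedule boundaries, and the seed are hypothetical, drawn only from the ranges stated in this registration.

```python
import random

# Hypothetical period-specific assignment probabilities (illustrative only; the
# registration states that probabilities were adjusted roughly every eight weeks
# and ranged from .75 to .15).
ASSIGNMENT_PROBABILITIES = [
    ("2008-06-15", 0.15),  # e.g., summer: few student-attorneys available
    ("2008-09-01", 0.75),  # e.g., academic year: fuller clinic capacity
]


def probability_for(intake_date: str) -> float:
    """Return the assignment probability in effect on the intake date (ISO string)."""
    current = ASSIGNMENT_PROBABILITIES[0][1]
    for period_start, p in ASSIGNMENT_PROBABILITIES:
        if intake_date >= period_start:  # ISO dates compare correctly as strings
            current = p
    return current


def randomize_case(case_id: str, intake_date: str, rng: random.Random) -> str:
    """Randomize one case as it is received: a Bernoulli draw at the current probability."""
    p = probability_for(intake_date)
    arm = "offer" if rng.random() < p else "no offer"
    print(f"{case_id} (intake {intake_date}, p={p}): {arm}")
    return arm


rng = random.Random(1677)  # fixed seed so the assignment log can be reproduced
randomize_case("case-001", "2008-07-01", rng)
randomize_case("case-002", "2008-10-12", rng)
```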
Randomization Unit
individual cases
Was the treatment clustered?
No

Experiment Characteristics

Sample size: planned number of clusters
207 cases
Sample size: planned number of observations
207 cases
Sample size (or number of clusters) by treatment arms
78 cases received HLAB offer (treatment), 129 cases received no offer (control)
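The registration leaves the minimum detectable effect field blank. As a hedged illustration only, the sketch below shows one way an MDE for the binary win/lose outcome could be approximated from the reported arm sizes; the .50 control-group win rate is an assumption, not a figure from the study.

```python
import math


def mde_two_proportions(n_treat: int, n_control: int, p_control: float,
                        alpha_z: float = 1.96, power_z: float = 0.84) -> float:
    """Approximate minimum detectable difference in win rates for a two-arm
    comparison of proportions (normal approximation; defaults correspond to
    two-sided alpha = .05 and power = .80)."""
    se = math.sqrt(p_control * (1 - p_control) * (1 / n_treat + 1 / n_control))
    return (alpha_z + power_z) * se


# Reported arm sizes: 78 offered representation, 129 not offered.
# The .50 control-group win rate is assumed purely for illustration.
print(round(mde_two_proportions(78, 129, 0.50), 3))
```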
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
IRB

Institutional Review Boards (IRBs)

IRB Name
Harvard Committee for Research on Human Subjects
IRB Approval Date
2008-06-27
IRB Approval Number
F16389

Post-Trial

Post Trial Information

Study Withdrawal

There is information in this trial unavailable to the public.

Intervention

Is the intervention completed?
Yes
Intervention Completion Date
May 15, 2010, 12:00 +00:00
Data Collection Complete
Yes
Data Collection Completion Date
May 15, 2010, 12:00 +00:00
Final Sample Size: Number of Clusters (Unit of Randomization)
207 cases
Was attrition correlated with treatment status?
No
Final Sample Size: Total Number of Observations
207 cases
Final Sample Size (or Number of Clusters) by Treatment Arms
78 cases received an HLAB offer of representation (treatment); 129 cases did not receive an HLAB offer of representation
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Abstract
We report the results of the first of a series of randomized evaluations of legal assistance programs. This series of evaluations is designed to measure the effect of both an offer of and the actual use of representation, although it was not possible in the first study we report here to measure constructively all effects of actual use. The results of this first evaluation are unexpected, and we caution against both overgeneralization and undergeneralization. Specifically, the offers of representation came from a law school clinic, which provided high-quality and well-respected assistance in administrative "appeals" to state administrative law judges (ALJs) of initial rulings regarding eligibility for unemployment benefits. These "appeals" were actually de novo mini-trials. Our randomized evaluation found that the offers of representation from the clinic had no statistically significant effect on the probability that unemployment claimants would prevail in their "appeals," but that the offers did delay proceedings by, on average, about two weeks. Actual use of representation (from any source) also delayed the proceeding; we could come to no firm conclusions regarding the effect of actual use of representation (from any source) on the probability that claimants would prevail. Keeping in mind the high-quality and well-respected nature of the representation the law school clinic offered and provided, we explore three possible explanations for our results, each of which has implications for delivery of legal services. We also conduct a review of previous quantitative research attempting to measure representation effects. We find that, excepting the results of two randomized studies separated by more than thirty years, this literature provides virtually no credible quantitative information on the effect of an offer of or actual use of legal representation. Finally, we discuss disadvantages, advantages, and future prospects of randomized studies in the provision of legal assistance.
Citation
Greiner, D. James and Pattanayak, Cassandra Wolos, Randomized Evaluation in Legal Assistance: What Difference Does Representation (Offer and Actual Use) Make? (July 29, 2011). Yale Law Journal, Vol. 121, 2011.

Reports & Other Materials