Management of Older Reviews

Last registered on March 21, 2022

Pre-Trial

Trial Information

General Information

Title
Management of Older Reviews
RCT ID
AEARCTR-0009113
Initial registration date
March 17, 2022

Initial registration date is when the trial was registered.

It corresponds to when the registration was submitted to the Registry to be reviewed for publication.

First published
March 21, 2022, 1:29 PM EDT

First published corresponds to when the trial was first made public on the Registry after being reviewed.

Locations

Region

Primary Investigator

Affiliation
Yale University

Other Primary Investigator(s)

Additional Trial Information

Status
In development
Start date
2022-06-01
End date
2022-07-31
Secondary IDs
Prior work
This trial does not extend or rely on any prior RCTs.
Abstract
The setting of this RCT is Glassdoor, a crowd-sourced platform where people anonymously post reviews of firms. Glassdoor has a "flag" function that allows users to report reviews that violate Glassdoor's terms of service.

This study aims to find out whether these reports are processed more quickly for newer reviews than for older ones.
External Link(s)

Registration Citation

Citation
Gong, Ping. 2022. "Management of Older Reviews." AEA RCT Registry. March 21. https://doi.org/10.1257/rct.9113-1.0
Experimental Details

Interventions

Intervention(s)
Intervention Start Date
2022-06-01
Intervention End Date
2022-07-31

Primary Outcomes

Primary Outcomes (end points)
Processing time of these reports
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
I will carefully read Glassdoor's terms of service and identify 200 reviews that violate them. I will then separate these into older reviews (posted more than one year ago) and younger reviews. Finally, I will randomly select and report 50 reviews in each group.
Experimental Design Details
Randomization Method
Randomization done in office by a computer. After drawing the sample of reviews, I will randomly label each review as 0 or 1. Reviews labelled "1" receive treatment (i.e., these reviews will be reported to the platform's data team).
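The stratified random assignment described above could be sketched as follows; this is an illustrative sketch, not the investigator's actual code, and the review IDs and group labels are hypothetical.

```python
import random

def assign_treatment(reviews, seed=42):
    """Randomly assign half of each age stratum to treatment (label 1) and
    half to control (label 0).

    reviews: dict mapping a review ID to its stratum, "older" or "younger".
    Returns a dict mapping each review ID to 0 or 1.
    """
    rng = random.Random(seed)  # fixed seed so the assignment is reproducible
    labels = {}
    for group in ("older", "younger"):
        ids = [rid for rid, g in reviews.items() if g == group]
        rng.shuffle(ids)
        half = len(ids) // 2
        for rid in ids[:half]:
            labels[rid] = 1  # treatment: the review is reported to Glassdoor
        for rid in ids[half:]:
            labels[rid] = 0  # control: the review is left alone
    return labels

# Hypothetical sample: 100 older and 100 younger review IDs
sample = {f"r{i}": ("older" if i < 100 else "younger") for i in range(200)}
assignment = assign_treatment(sample)
```

With 100 reviews in each stratum, this yields 50 treated and 50 control reviews per group, matching the planned treatment arms.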
Randomization Unit
A review
Was the treatment clustered?
No

Experiment Characteristics

Sample size: planned number of clusters
200 reviews
Sample size: planned number of observations
200 reviews
Sample size (or number of clusters) by treatment arms
50 reviews from older reviews group receive treatment (i.e. being reported),
50 reviews from older reviews group receive nothing,
50 reviews from younger reviews group receive treatment (i.e. being reported), and
50 reviews from younger reviews group receive nothing.
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
IRB

Institutional Review Boards (IRBs)

IRB Name
IRB Approval Date
IRB Approval Number

Post-Trial

Post Trial Information

Study Withdrawal

There is information in this trial unavailable to the public.

Intervention

Is the intervention completed?
No
Data Collection Complete
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials