Inefficient Hiring in Entry-Level Labor Markets

Last registered on September 12, 2016

Pre-Trial

Trial Information

General Information

Title
Inefficient Hiring in Entry-Level Labor Markets
RCT ID
AEARCTR-0001021
Initial registration date
September 12, 2016

First published
September 12, 2016, 2:18 PM EDT

Locations

Primary Investigator

Amanda Pallais
Affiliation
Harvard University

Other Primary Investigator(s)

Additional Trial Information

Status
Completed
Start date
2010-04-25
End date
2010-12-31
Secondary IDs
Abstract
Hiring inexperienced workers generates information about their abilities. If this information is public, workers obtain its benefits. If workers cannot compensate firms for hiring them, firms will hire too few inexperienced workers. I determine the effects of hiring workers and revealing more information about their abilities through a field experiment in an online marketplace. I hired 952 randomly-selected workers, giving them either detailed or coarse public evaluations. Both hiring workers and providing more detailed evaluations substantially improved workers’ subsequent employment outcomes. Under plausible assumptions, the experiment’s market-level benefits exceeded its cost, suggesting that some experimental workers had been inefficiently unemployed.
External Link(s)

Registration Citation

Citation
Pallais, Amanda. 2016. "Inefficient Hiring in Entry-Level Labor Markets." AEA RCT Registry. September 12. https://doi.org/10.1257/rct.1021-1.0
Former Citation
Pallais, Amanda. 2016. "Inefficient Hiring in Entry-Level Labor Markets." AEA RCT Registry. September 12. https://www.socialscienceregistry.org/trials/1021/history/10594
Experimental Details

Interventions

Intervention(s)
Low-wage data-entry specialists from the online marketplace oDesk were invited to apply for 10-hour data-entry jobs. The 3,767 workers who applied and proposed wages of $3 per hour or less formed the experimental sample. These workers were randomly assigned to three groups: (1) a treatment group that was hired and given an evaluation like those traditionally given in the marketplace, (2) a treatment group that was hired and given a more detailed evaluation, and (3) a control group that was not hired and thus did not receive any evaluation.
Intervention Start Date
2010-04-25
Intervention End Date
2010-08-01

Primary Outcomes

Primary Outcomes (end points)
Post-experiment employment, earnings, and reservation wages of workers in all three experimental groups
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
Workers were first randomized into either the control group or one of two treatment groups: the detailed evaluation or the coarse evaluation treatment group. Randomization into any treatment group was stratified on prior oDesk experience, such that workers without oDesk experience had a higher chance of being in any treatment group (32 percent) than experienced workers (15 percent). Conditional on receiving any treatment, all workers had a 50 percent chance of receiving the detailed evaluation treatment. Inexperienced workers constituted approximately three quarters of each treatment group.
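
The assignment mechanism can be sketched as follows. The 32 and 15 percent treatment probabilities and the 50/50 split between evaluation types come from the design above; the function name, labels, and seed are illustrative only, not from the registration.

    import random

    rng = random.Random(2010)  # seed chosen for illustration only

    def assign_group(has_odesk_experience):
        # Stratified on prior oDesk experience: inexperienced workers face a
        # 32 percent chance of any treatment, experienced workers 15 percent.
        p_treat = 0.15 if has_odesk_experience else 0.32
        if rng.random() >= p_treat:
            return "control"
        # Conditional on any treatment, 50 percent receive the detailed evaluation.
        return "detailed evaluation" if rng.random() < 0.5 else "coarse evaluation"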

The coarse evaluation treatment was designed to be equivalent to being hired (and, thus, evaluated) by a typical employer in the marketplace. The detailed evaluation treatment was identical to the coarse evaluation treatment except that it provided the market with more information about some workers’ job performance. Workers in both treatment groups were hired and given a maximum of ten hours over one week to enter the data. They were told that if they had not completed the task after spending ten hours on it, they should send the file back unfinished. The following objective measures of workers’ performance were used: their data-entry speed, their error rate, the date they returned the data file, and three measures of whether they had followed the data-entry instructions. All hired workers were rated on a one-to-five scale using a weighted average of their scores on these performance measures. The distribution of scores from the job was designed to match the distribution of scores low-wage data-entry workers received in the marketplace, adjusted for the fact that a worker in the sample was more likely to be inexperienced than a typical oDesk worker. The scores were calculated in the same way for workers in both treatment groups. Approximately 18 percent of workers did not return the file or log any hours; under oDesk’s protocol, these workers were not rated. The treatments should thus be considered an intent to hire.
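
A minimal sketch of the rating rule, assuming the composite score is a simple weighted average: the actual weights and the cutoffs used to match the marketplace's rating distribution are not specified in the registration, so the values below are placeholders.

    def rate_worker(measures, weights, cutoffs=(0.2, 0.4, 0.6, 0.8)):
        # Composite score: weighted average of the six objective performance
        # measures (speed, error rate, return date, three instruction checks),
        # each assumed here to be normalized so that higher is better.
        score = sum(w * m for w, m in zip(weights, measures)) / sum(weights)
        # Map the score to a one-to-five rating; the cutoffs would be chosen
        # so that the rating distribution matches that of low-wage data-entry
        # workers in the marketplace, adjusted for experience.
        return 1 + sum(score >= c for c in cutoffs)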

The particular treatment group to which workers were assigned affected only the type of comment workers were eligible to receive. No workers in either treatment group received a comment if they earned a rating below three. The remaining workers in the coarse evaluation treatment received an uninformative comment. The remaining workers in the detailed evaluation treatment received a detailed comment if they scored at least a four and an uninformative comment if they scored between three and four. Workers in the detailed evaluation treatment did not know that they would receive a detailed evaluation until it was posted.
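
The comment-eligibility rule is fully specified above and can be written out directly; only the group and return labels are illustrative.

    def comment_type(group, rating):
        # No comment in either treatment group for ratings below three.
        if rating < 3:
            return None
        if group == "coarse evaluation":
            return "uninformative"
        # Detailed-evaluation group: a detailed comment requires a rating of at
        # least four; ratings from three up to four get the uninformative one.
        return "detailed" if rating >= 4 else "uninformative"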

The uninformative comment was chosen to be short and positive, like most of the comments in the marketplace. The detailed comment provided objective information on the worker’s data entry speed and accuracy, whether the worker met the deadline, and whether she followed the job’s instructions. Additionally, it repeated the uninformative comment, so the only difference between the two comment types was the objective information provided in the detailed evaluation.
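
As a sketch of how the two comment types relate, assuming hypothetical field names and placeholder wording (the registration gives neither the exact text nor the format of the statistics):

    def build_comment(kind, stats, uninformative_text="Great work!"):
        # Both comment types start from the same short, positive text, so the
        # detailed comment differs only in the appended objective information.
        if kind != "detailed":
            return uninformative_text
        return (uninformative_text
                + f" Entered data at {stats['speed']} with an error rate of"
                + f" {stats['error_rate']:.1%};"
                + f" {'met' if stats['met_deadline'] else 'missed'} the deadline and"
                + f" {'followed' if stats['followed_instructions'] else 'did not follow'}"
                + " the instructions.")
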
Experimental Design Details
Randomization Method
Randomization in office via computer
Randomization Unit
Individual randomization
Was the treatment clustered?
No

Experiment Characteristics

Sample size: planned number of clusters
3,767 workers
Sample size: planned number of observations
3,767 workers
Sample size (or number of clusters) by treatment arms
Treatment groups: 1) 476 workers hired and given detailed job evaluation; 2) 476 workers hired and given coarse job evaluation;
Control group: 2,815 workers not hired
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
IRB

Institutional Review Boards (IRBs)

IRB Name
Massachusetts Institute of Technology
IRB Approval Date
2010-03-27
IRB Approval Number
1002003719

Post-Trial

Post Trial Information

Study Withdrawal

Intervention

Is the intervention completed?
Yes
Intervention Completion Date
August 01, 2010, 12:00 +00:00
Data Collection Complete
Yes
Data Collection Completion Date
December 31, 2010, 12:00 +00:00
Final Sample Size: Number of Clusters (Unit of Randomization)
3,767 workers
Was attrition correlated with treatment status?
No
Final Sample Size: Total Number of Observations
3,767 workers
Final Sample Size (or Number of Clusters) by Treatment Arms
Treatment groups: 1) 476 workers with detailed job evaluation; 2) 476 workers with coarse job evaluation; Control group: 2,815 workers not hired
Data Publication

Data Publication

Is public data available?
Yes

Program Files

Program Files
Yes
Reports, Papers & Other Materials

Relevant Paper(s)

Abstract
Hiring inexperienced workers generates information about their abilities. If this information is public, workers obtain its benefits. If workers cannot compensate firms for hiring them, firms will hire too few inexperienced workers. I determine the effects of hiring workers and revealing more information about their abilities through a field experiment in an online marketplace. I hired 952 randomly-selected workers, giving them either detailed or coarse public evaluations. Both hiring workers and providing more detailed evaluations substantially improved workers’ subsequent employment outcomes. Under plausible assumptions, the experiment’s market-level benefits exceeded its cost, suggesting that some experimental workers had been inefficiently unemployed.
Citation
Pallais, Amanda. 2014. "Inefficient Hiring in Entry-Level Labor Markets." American Economic Review 104(11): 3565-3599.

Reports & Other Materials