Asymmetric Information in Labor Contracts: Evidence from an Online Experiment

Last registered on September 11, 2024


Trial Information

General Information

Title
Asymmetric Information in Labor Contracts: Evidence from an Online Experiment
RCT ID
AEARCTR-0013138
Initial registration date
August 27, 2024


First published
September 11, 2024, 11:39 AM EDT


Locations

There is information in this trial unavailable to the public.

Primary Investigator

Affiliation
University of Arizona

Other Primary Investigator(s)

Additional Trial Information

Status
In development
Start date
2024-08-28
End date
2024-12-01
Secondary IDs
Prior work
This trial does not extend or rely on any prior RCTs.
Abstract
For short-term workers facing uncertain output, hourly wage contracts provide implicit insurance compared to self-employment or piece-rate pay. But like any insurance product, these contracts are prone to market distortions through moral hazard and adverse selection. Using a model of wage contracts under asymmetric information, I show how these distortions can be identified as potential outcomes in a marginal-treatment-effects framework. I apply this framework to a field experiment in which data-entry workers are offered a choice between a randomized hourly wage and a standardized piece rate. Using experimental wage offers as an instrument for hourly wage take-up identifies moral hazard, while comparisons between piece-rate workers who declined different wage offers identify adverse selection.
External Link(s)

Registration Citation

Citation
Herbst, Daniel. 2024. "Asymmetric Information in Labor Contracts: Evidence from an Online Experiment." AEA RCT Registry. September 11. https://doi.org/10.1257/rct.13138-1.0
Experimental Details

Interventions

Intervention(s)
Intervention Start Date
2024-08-28
Intervention End Date
2024-09-30

Primary Outcomes

Primary Outcomes (end points)
The main outcome of interest is the number of correctly completed sentences, normalized to value-per-hour terms (i.e., Y = number of correct sentences × $0.03 × 60 min / 5 min).
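As an illustrative sketch (not part of the registered protocol), the normalization above can be computed directly. The $0.03 piece value and five-minute task length come from the design; the sentence count below is a made-up example.

```python
def value_per_hour(correct_sentences, piece_value=0.03, task_minutes=5):
    """Y = number of correct sentences * $0.03 * (60 min / 5 min)."""
    return correct_sentences * piece_value * (60 / task_minutes)

# A hypothetical worker completing 20 correct sentences in five minutes:
print(round(value_per_hour(20), 2))  # 7.2 -> $7.20 per hour in value terms
```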
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
I will also measure the number of attempted sentences (correct plus incorrect).
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
My experimental design offers workers a choice between a randomized hourly wage and a standardized piece rate. Comparing realized output between individuals who faced different hourly wage offers but ultimately chose the common piece rate identifies adverse selection: both groups ultimately face the same compensation scheme but made their decisions under different alternative options. So, if workers choose contracts based on their privately known productivities, those who decline more generous hourly payments should perform better than those forgoing more modest wages. At the same time, because I observe worker output under both contract choices in each treatment group, a standard two-stage-least-squares estimation allows me to separately identify treatment effects of hourly wages among those who accept the offer.
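The identification argument above can be illustrated with a small simulation. All parameter values here (the offer levels, the productivity distribution, and a -1.5 moral-hazard effect) are assumptions invented for this sketch, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20_000
z = rng.choice([3.0, 6.0, 12.0], size=n)   # randomized hourly wage offer (instrument)
a = rng.normal(7.0, 2.0, size=n)           # privately known productivity
d = (z > a).astype(float)                  # accept the hourly offer if it beats own output value
y = a - 1.5 * d + rng.normal(0.0, 1.0, size=n)  # realized output (assumed -1.5 moral-hazard effect)

# 2SLS with a single instrument reduces to the Wald/IV ratio,
# recovering the moral-hazard effect among accepters:
beta_iv = np.cov(y, z)[0, 1] / np.cov(d, z)[0, 1]

# Adverse selection: among decliners (who all face the same piece rate),
# those who turned down the more generous offer are more productive:
decliners = d == 0
gap = y[decliners & (z == 12.0)].mean() - y[decliners & (z == 3.0)].mean()
print(round(beta_iv, 2), round(gap, 2))  # beta_iv near -1.5; gap positive
```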
Experimental Design Details
Not available
Randomization Method
Because my experimental groups have different intended sample sizes, I split these groups into 60 smaller subgroups, so that each subgroup has the same intended number of participants (50). I then use my office computer to place these 60 subgroups in random order, assigning a unique URL to each subgroup based on that randomized order. Then, on the Prolific platform, each participant is routed to an experimental condition using the URL corresponding to the order in which they click the job posting: the first participant is assigned URL 1, the second participant is assigned URL 2, and so on. After the first 60 participants have been assigned URLs 1 through 60, the 61st participant is assigned URL 1 and the process starts over.

Because each participant can only access the URL to which their Prolific ID is assigned, this process prevents cross-treatment contamination. Treatment assignment using other methods (e.g., randomization in Qualtrics) can be manipulated by clearing browser histories, deleting cookies, or using VPNs.
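The round-robin routing described above can be sketched as follows. The URL pattern and seed are placeholders, not the study's actual values.

```python
import random

random.seed(0)                      # stands in for the one-time office-computer shuffle
n_subgroups = 60                    # 60 subgroups of 50 intended participants each
order = list(range(n_subgroups))
random.shuffle(order)               # subgroups placed in random order
urls = [f"https://study.example/condition/{g}" for g in order]  # URLs 1..60

def url_for_participant(click_order):
    """Route the k-th participant (1-indexed by click order) round-robin:
    participants 1-60 get URLs 1-60, then participant 61 wraps back to URL 1."""
    return urls[(click_order - 1) % n_subgroups]
```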
Randomization Unit
individual
Was the treatment clustered?
No

Experiment Characteristics

Sample size: planned number of clusters
3000

Note: I estimate that a sample size of 3,000 participants will cost $6,600, using conservative predictions for participant take-up and performance across each treatment group. However, this estimate may be inaccurate. If piece-rate participants outperform expectations, or an unexpectedly large number of participants become eligible for the surprise top-up, experimental costs may exceed my limited budget before reaching a sample size of 3,000. On the other hand, if participants underperform expectations, I may reach a sample size of 3,000 with funds to spare. I plan to adjust the sample size as needed so that experimental costs remain between $5,000 and $8,000.
Sample size: planned number of observations
3000
Sample size (or number of clusters) by treatment arms
300 No Hourly Offer (Control)
150 $1.20/hr Offer
150 $1.20/hr Offer + Surprise Bonus
50 $1.80/hr Offer
50 $1.80/hr Offer + Surprise Bonus
50 $2.40/hr Offer
50 $2.40/hr Offer + Surprise Bonus
150 $3.00/hr Offer
150 $3.00/hr Offer + Surprise Bonus
50 $3.60/hr Offer
50 $3.60/hr Offer + Surprise Bonus
50 $4.20/hr Offer
50 $4.20/hr Offer + Surprise Bonus
50 $4.80/hr Offer
50 $4.80/hr Offer + Surprise Bonus
50 $5.40/hr Offer
50 $5.40/hr Offer + Surprise Bonus
150 $6.00/hr Offer
150 $6.00/hr Offer + Surprise Bonus
50 $7.20/hr Offer
50 $7.20/hr Offer + Surprise Bonus
50 $8.40/hr Offer
50 $8.40/hr Offer + Surprise Bonus
50 $9.60/hr Offer
50 $9.60/hr Offer + Surprise Bonus
50 $10.80/hr Offer
50 $10.80/hr Offer + Surprise Bonus
150 $12.00/hr Offer
150 $12.00/hr Offer + Surprise Bonus
50 $15.00/hr Offer
50 $15.00/hr Offer + Surprise Bonus
50 $18.00/hr Offer
50 $18.00/hr Offer + Surprise Bonus
300 $21.00/hr Offer

Note: Each participant is offered a choice between their treatment group's hourly offer (prorated to five minutes) or a $0.03 piece rate. The "Surprise Bonus" condition, which raises effective wages to parity with the $21.00/hr offer, is only applied to participants who accept their hourly wage offer. Participants in the "$3.00/hr Offer + Surprise Bonus" group who decline their hourly offer in favor of the $0.03 piece rate will never see a "surprise bonus," and their treatment will be identical to decliners in the "$3.00/hr Offer" group.
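As a sanity check on the arm table above, the sketch below verifies that the arm sizes sum to the planned 3,000 participants and shows the five-minute proration of an hourly offer; the tuple layout is just a convenient encoding of the table.

```python
# Arm sizes from the table above: (hourly offer, n plain, n with surprise bonus).
arms = [
    (None,  300,   0),   # control: no hourly offer
    (1.20,  150, 150), (1.80,  50,  50), (2.40,  50,  50), (3.00, 150, 150),
    (3.60,   50,  50), (4.20,  50,  50), (4.80,  50,  50), (5.40,  50,  50),
    (6.00,  150, 150), (7.20,  50,  50), (8.40,  50,  50), (9.60,  50,  50),
    (10.80,  50,  50), (12.00, 150, 150), (15.00, 50, 50), (18.00, 50, 50),
    (21.00, 300,   0),   # top offer: no separate bonus arm (the bonus tops up to this level)
]
total = sum(plain + bonus for _, plain, bonus in arms)
print(total)  # 3000, matching the planned sample size

# Each offer is prorated to the five-minute task, e.g. the $21.00/hr offer:
print(21.00 * 5 / 60)  # 1.75 -> $1.75 for five minutes
```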
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
IRB

Institutional Review Boards (IRBs)

IRB Name
University of Arizona Institutional Review Board (IRB)
IRB Approval Date
2024-05-16
IRB Approval Number
STUDY00003928
Analysis Plan
