Asymmetric Information in Labor Contracts: Evidence from an Online Experiment

Last registered on February 01, 2025

Pre-Trial

Trial Information

General Information

Title
Asymmetric Information in Labor Contracts: Evidence from an Online Experiment
RCT ID
AEARCTR-0013138
Initial registration date
August 27, 2024


First published
September 11, 2024, 11:39 AM EDT


Last updated
February 01, 2025, 2:35 PM EST


Locations

Region

Primary Investigator

Affiliation
University of Arizona

Other Primary Investigator(s)

Additional Trial Information

Status
Completed
Start date
2024-08-28
End date
2024-12-01
Secondary IDs
Prior work
This trial does not extend or rely on any prior RCTs.
Abstract
For short-term workers facing uncertain output, hourly wage contracts provide implicit insurance compared to self-employment or piece-rate pay. But like any insurance product, these contracts are prone to market distortions through moral hazard and adverse selection. Using a model of wage contracts under asymmetric information, I show how these distortions can be identified as potential outcomes in a marginal-treatment-effects framework. I apply this framework to a field experiment in which data-entry workers are offered a choice between a randomized hourly wage and a standardized piece rate. Using experimental wage offers as an instrument for hourly wage take-up identifies moral hazard, while comparisons between piece-rate workers who declined different wage offers identify adverse selection.
External Link(s)

Registration Citation

Citation
Herbst, Daniel. 2025. "Asymmetric Information in Labor Contracts: Evidence from an Online Experiment." AEA RCT Registry. February 01. https://doi.org/10.1257/rct.13138-2.0
Experimental Details

Interventions

Intervention(s)
Intervention (Hidden)
Intervention Start Date
2024-08-28
Intervention End Date
2024-09-30

Primary Outcomes

Primary Outcomes (end points)
The main outcome of interest is the number of correctly completed sentences, normalized to value-per-hour terms (i.e., Y = number of sentences × $0.03 × 60 min/5 min).
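The normalization above can be sketched in a few lines; this is an illustrative reconstruction of the stated formula, not the study's actual code, and the function and variable names are my own.

```python
# Sketch of the primary-outcome normalization: a 5-minute sentence count
# converted to dollars-per-hour terms at the $0.03 piece rate.
PIECE_RATE = 0.03   # dollars per correctly completed sentence
TASK_MINUTES = 5    # length of the data-entry task

def value_per_hour(correct_sentences: int) -> float:
    """Y = sentences x $0.03 x (60 min / 5 min)."""
    return correct_sentences * PIECE_RATE * (60 / TASK_MINUTES)

# e.g., 20 correct sentences in 5 minutes is a $7.20/hour equivalent
print(round(value_per_hour(20), 2))
```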
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
I will also measure the number of attempted sentences (correct plus incorrect).
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
My experimental design offers workers a choice between a randomized hourly wage and a standardized piece rate. Comparing realized output between individuals who faced different hourly wage offers but ultimately chose the common piece rate identifies adverse selection: both groups face the same compensation scheme but made their decisions under different alternative options. If workers choose contracts based on their privately known productivities, those who decline more generous hourly payments should outperform those who forgo more modest wages. At the same time, because I observe worker output under both contract choices in each treatment group, a standard two-stage-least-squares estimation separately identifies the treatment effect of hourly wages among those who accept the offer.
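The two-stage-least-squares logic above can be illustrated with simulated data. Everything in this sketch is invented for illustration: the productivity distribution, the offer values, and the assumed -2 $/hr moral-hazard effect are my own placeholders, not the study's code or results.

```python
# Illustrative 2SLS sketch: randomized wage offers instrument hourly take-up.
import numpy as np

rng = np.random.default_rng(0)
n = 5_000

ability = rng.normal(10.0, 2.0, n)              # privately known productivity ($/hr)
offer = rng.choice([3.0, 6.0, 12.0, 21.0], n)   # randomized hourly wage offers

# Adverse selection: lower-ability workers are more likely to accept the hourly wage.
accept = (offer > ability + rng.normal(0.0, 1.0, n)).astype(float)

# Moral hazard (assumed): accepting the hourly contract lowers output by 2 $/hr.
output = ability - 2.0 * accept + rng.normal(0.0, 1.0, n)

# Manual 2SLS. First stage: regress take-up on the randomized offer.
Z = np.column_stack([np.ones(n), offer])
beta_fs = np.linalg.lstsq(Z, accept, rcond=None)[0]
accept_hat = Z @ beta_fs

# Second stage: regress output on fitted take-up.
X_hat = np.column_stack([np.ones(n), accept_hat])
beta_2sls = np.linalg.lstsq(X_hat, output, rcond=None)[0]

print(f"2SLS effect of hourly take-up on output: {beta_2sls[1]:.2f}")
```

Because low-ability workers select into the hourly contract, a naive OLS regression of output on take-up would overstate the negative effect; the instrumented estimate recovers the simulated -2 $/hr moral-hazard effect.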
Experimental Design Details
The experiment takes place on Prolific, an online platform that pays participants for completing short-term tasks or surveys. I create a data-entry position on Prolific that involves typing handwritten text into text boxes for five minutes. The job posting specifies a $1.00 reward plus a bonus of $0.03 per correctly completed sentence.

Participants who accept the job are randomly assigned to one of eighteen experimental groups. In seventeen of these groups, participants are offered a choice between the $0.03 piece rate and an alternative flat wage bonus for completing the five-minute task. Depending on the treatment group, the flat wage takes on one of the following values: $0.10, $0.15, $0.20, $0.25, $0.30, $0.35, $0.40, $0.45, $0.50, $0.60, $0.70, $0.80, $0.90, $1.00, $1.25, $1.50, or $1.75. Workers in these treatment groups can accept this flat bonus offer or accept the piece-rate bonus of $0.03 per correctly completed sentence. In addition to these treatment groups, a control group is paid the $0.03 piece rate with no flat wage alternative.

After workers choose a bonus structure, a random 50% of those opting into flat bonus offers below $1.75 receive a notification telling them they will earn a higher bonus of $1.75 instead. All other workers receive a notification confirming their choice. After this notification, workers can begin work on the task. The task requires 5 minutes of work to be completed immediately after accepting the offer. After completing their tasks, I pay each worker the amount promised by their chosen compensation scheme, plus any supplemental payments necessary to satisfy "surprise" bonus offers.
Randomization Method
Because my experimental groups have different intended sample sizes, I split these groups into 60 smaller subgroups, so that each subgroup is intended to treat the same number of participants (50). I then use my office computer to place these 60 subgroups in random order, assigning a unique URL to each subgroup based on that randomized order. On the Prolific platform, each participant is then routed to an experimental condition using the URL corresponding to the order in which they click the job posting: the first participant is assigned URL 1, the second participant URL 2, and so on. After the first 60 participants have been assigned URLs 1 through 60, the 61st participant is assigned URL 1 and the process starts over.

Because each participant can only access the URL to which their Prolific ID is assigned, this process prevents cross-treatment contamination. Treatment assignment using other methods (e.g., randomization in Qualtrics) can be manipulated by clearing browser histories, deleting cookies, or using VPNs.
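The round-robin URL routing described above can be sketched as follows. This is a hypothetical reconstruction: the seed and URL pattern are placeholders of mine, not the study's actual values.

```python
# Sketch of URL-based round-robin assignment over 60 randomly ordered subgroups.
import random

N_SUBGROUPS = 60      # 60 equal subgroups of 50 intended participants each
random.seed(2024)     # placeholder seed, not the study's

# Randomly order the subgroups once; URL k permanently serves subgroup order[k].
order = list(range(N_SUBGROUPS))
random.shuffle(order)
urls = {k + 1: f"https://example.com/task/{order[k]}" for k in range(N_SUBGROUPS)}

def url_for_participant(i: int) -> str:
    """Route the i-th participant (1-indexed) cyclically over URLs 1..60."""
    return urls[(i - 1) % N_SUBGROUPS + 1]

# Participants 1 and 61 land on the same URL; participant 2 gets the next one.
assert url_for_participant(1) == url_for_participant(61)
```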
Randomization Unit
individual
Was the treatment clustered?
No

Experiment Characteristics

Sample size: planned number of clusters
3000

Note: I estimate that a sample size of 3,000 participants will cost $6,600, using conservative predictions for participant take-up and performance across each treatment group. However, this estimate may be inaccurate. If piece-rate participants outperform expectations, or if an unexpectedly large number of participants become eligible for the surprise top-up, experimental costs may exceed my limited budget before reaching a sample size of 3,000. On the other hand, if participants underperform expectations, I may reach a sample size of 3,000 with excess funds to spare. I plan to adjust the sample size as needed so that experimental costs remain between $5,000 and $8,000.
Sample size: planned number of observations
3000
Sample size (or number of clusters) by treatment arms
300 No Hourly Offer (Control)
150 $1.20/hr Offer
150 $1.20/hr Offer + Surprise Bonus
50 $1.80/hr Offer
50 $1.80/hr Offer + Surprise Bonus
50 $2.40/hr Offer
50 $2.40/hr Offer + Surprise Bonus
150 $3.00/hr Offer
150 $3.00/hr Offer + Surprise Bonus
50 $3.60/hr Offer
50 $3.60/hr Offer + Surprise Bonus
50 $4.20/hr Offer
50 $4.20/hr Offer + Surprise Bonus
50 $4.80/hr Offer
50 $4.80/hr Offer + Surprise Bonus
50 $5.40/hr Offer
50 $5.40/hr Offer + Surprise Bonus
150 $6.00/hr Offer
150 $6.00/hr Offer + Surprise Bonus
50 $7.20/hr Offer
50 $7.20/hr Offer + Surprise Bonus
50 $8.40/hr Offer
50 $8.40/hr Offer + Surprise Bonus
50 $9.60/hr Offer
50 $9.60/hr Offer + Surprise Bonus
50 $10.80/hr Offer
50 $10.80/hr Offer + Surprise Bonus
150 $12.00/hr Offer
150 $12.00/hr Offer + Surprise Bonus
50 $15.00/hr Offer
50 $15.00/hr Offer + Surprise Bonus
50 $18.00/hr Offer
50 $18.00/hr Offer + Surprise Bonus
300 $21.00/hr Offer

Note: Each participant is offered a choice between their treatment group's hourly offer (prorated to five minutes) or a $0.03 piece rate. The "Surprise Bonus" condition, which raises effective wages to parity with the $21.00/hr offer, is only applied to participants who accept their hourly wage offer. Participants in the "$3.00/hr Offer + Surprise Bonus" group who decline their hourly offer in favor of the $0.03 piece rate will never see a "surprise bonus," and their treatment will be identical to decliners in the "$3.00/hr Offer" group.
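The mapping between the 5-minute flat bonuses and the hourly-equivalent labels in the arm list above can be checked in a couple of lines; the bonus values are taken from the design, and the code itself is illustrative only.

```python
# A 5-minute task fits into an hour 12 times, so each flat bonus
# corresponds to 12x its value as an hourly wage equivalent.
flat_bonuses = [0.10, 0.15, 0.20, 0.25, 0.30, 0.35, 0.40, 0.45, 0.50,
                0.60, 0.70, 0.80, 0.90, 1.00, 1.25, 1.50, 1.75]

hourly_equivalents = [round(b * 12, 2) for b in flat_bonuses]
print(hourly_equivalents[0], hourly_equivalents[-1])  # 1.2 21.0
```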
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
IRB

Institutional Review Boards (IRBs)

IRB Name
University of Arizona Institutional Review Board (IRB)
IRB Approval Date
2024-05-16
IRB Approval Number
STUDY00003928
Analysis Plan

There is information in this trial unavailable to the public.

Post-Trial

Post Trial Information

Study Withdrawal

There is information in this trial unavailable to the public.

Intervention

Is the intervention completed?
Yes
Intervention Completion Date
September 13, 2024, 12:00 +00:00
Data Collection Complete
Yes
Data Collection Completion Date
September 13, 2024, 12:00 +00:00
Final Sample Size: Number of Clusters (Unit of Randomization)
3030 participants
Was attrition correlated with treatment status?
No
Final Sample Size: Total Number of Observations
3030 participants
Final Sample Size (or Number of Clusters) by Treatment Arms
302 No Hourly Offer (Control)
150 $1.20/hr Offer
150 $1.20/hr Offer + Surprise Bonus
50 $1.80/hr Offer
51 $1.80/hr Offer + Surprise Bonus
51 $2.40/hr Offer
52 $2.40/hr Offer + Surprise Bonus
152 $3.00/hr Offer
152 $3.00/hr Offer + Surprise Bonus
50 $3.60/hr Offer
50 $3.60/hr Offer + Surprise Bonus
49 $4.20/hr Offer
50 $4.20/hr Offer + Surprise Bonus
51 $4.80/hr Offer
50 $4.80/hr Offer + Surprise Bonus
50 $5.40/hr Offer
51 $5.40/hr Offer + Surprise Bonus
152 $6.00/hr Offer
153 $6.00/hr Offer + Surprise Bonus
50 $7.20/hr Offer
50 $7.20/hr Offer + Surprise Bonus
51 $8.40/hr Offer
51 $8.40/hr Offer + Surprise Bonus
51 $9.60/hr Offer
50 $9.60/hr Offer + Surprise Bonus
50 $10.80/hr Offer
50 $10.80/hr Offer + Surprise Bonus
152 $12.00/hr Offer
153 $12.00/hr Offer + Surprise Bonus
50 $15.00/hr Offer
50 $15.00/hr Offer + Surprise Bonus
51 $18.00/hr Offer
51 $18.00/hr Offer + Surprise Bonus
304 $21.00/hr Offer

Note: A small number of participants exited early or failed to complete the task within thirty minutes. Observations for these participants are still recorded in my sample, but the Prolific task scheduler automatically re-assigns their treatment conditions to new participants. These re-assigned treatments result in an observation count (N = 3,030) that slightly exceeds my pre-registered sample size of 3,000.
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
No
Reports, Papers & Other Materials

Relevant Paper(s)

Abstract
For workers facing uncertain output, fixed-wage contracts provide implicit insurance compared to self-employment or performance-based pay. But like any insurance product, these contracts are prone to market distortions through moral hazard and adverse selection. Using a model of wage contracts under asymmetric information, I show how these distortions can be identified as potential outcomes in a marginal treatment effects framework. I apply this framework to a field experiment in which data entry workers are offered a choice between a randomized hourly wage and a standardized piece rate. Using experimental wage offers as an instrument for hourly wage take-up, I find evidence of both moral hazard and adverse selection. Hourly wage contracts reduce worker productivity by an estimated 6.32 percent relative to the mean. Meanwhile, a 10 percent increase in the hourly wage offer attracts a marginal worker whose productivity is higher by 1.44 percent of mean worker output. I estimate the welfare loss associated with asymmetric information and calculate marginal values of public funds (MVPFs) across a range of wage-based subsidy and tax policies. My estimates suggest that a 15 percent tax on performance-based pay can efficiently raise government revenue by correcting the market inefficiencies associated with adverse selection.
Citation
Herbst, Daniel, Asymmetric Information in Wage Contracts: Evidence from an Online Experiment (February 01, 2025). Available at SSRN: https://ssrn.com/abstract=5052461 or http://dx.doi.org/10.2139/ssrn.5052461

Reports & Other Materials