Real-effort laboratory experiment on the effects of production uncertainty on agent effort allocation among two inputs.

Last registered on May 14, 2021

Pre-Trial

Trial Information

General Information

Title
Real-effort laboratory experiment on the effects of production uncertainty on agent effort allocation among two inputs.
RCT ID
AEARCTR-0007665
Initial registration date
May 11, 2021

Initial registration date is when the trial was registered. It corresponds to when the registration was submitted to the Registry to be reviewed for publication.

First published
May 14, 2021, 9:36 AM EDT

First published corresponds to when the trial was first made public on the Registry after being reviewed.

Locations

Region

Primary Investigator

Affiliation
United States Military Academy

Other Primary Investigator(s)

Additional Trial Information

Status
Completed
Start date
2016-10-17
End date
2018-08-01
Secondary IDs
Abstract
Incentives in complex jobs, such as education and health, have had mixed results in developed countries. Both of these jobs have complex production functions. One way to describe job production complexity is to allow agents to face uncertainty about how their inputs translate into productivity. When coupled with large output incentives, such uncertainty can induce agents to inefficiently shift their effort toward inputs with lower uncertainty, even at the cost of reducing average output. This experiment tests that hypothesis with a two-input real-effort laboratory experiment. Participants are paid based on the number of easy and hard math questions they answer correctly. In treatment, I increase the uncertainty about the marginal payoff of each input. The experiment tests whether increased marginal uncertainty induces agents to inefficiently switch their effort allocation.
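As a rough illustration of the hypothesized mechanism (not part of the registered design), the sketch below uses hypothetical piece rates and CRRA utility to show how a risk-averse agent can prefer a lower-mean, lower-variance input; every number and the risk-aversion parameter are assumptions for exposition only.

# Illustrative sketch only: hypothetical piece rates and risk aversion, not the
# experiment's actual parameters. A risk-averse agent can rank a certain payoff
# of 0.9 per correct answer above a risky payoff with mean 1.0.
def certainty_equivalent(payoffs, probs, rho=0.5):
    """Certainty equivalent under CRRA utility u(x) = x**(1 - rho) / (1 - rho)."""
    eu = sum(p * x ** (1 - rho) / (1 - rho) for x, p in zip(payoffs, probs))
    return (eu * (1 - rho)) ** (1 / (1 - rho))

ce_certain = certainty_equivalent([0.9], [1.0])           # = 0.90
ce_risky = certainty_equivalent([0.1, 1.9], [0.5, 0.5])   # about 0.72, despite a mean of 1.0
print(ce_certain, ce_risky)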
External Link(s)

Registration Citation

Citation
Phipps, Aaron. 2021. "Real-effort laboratory experiment on the effects of production uncertainty on agent effort allocation among two inputs." AEA RCT Registry. May 14. https://doi.org/10.1257/rct.7665-1.0
Experimental Details

Interventions

Intervention(s)
Participants use a web browser to complete addition tasks while a proctor controls the flow of the experiment from an administrative dashboard. The basic innovation is to present participants with a choice between two possible tasks (inputs). Within the allotted time, they attempt to complete either task correctly as many times as possible. The two tasks differ in difficulty and financial payoff.
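A minimal sketch of how the two task types might be pre-generated is below; the operand ranges, problem counts, and seeds are hypothetical and are not specified in the registration.

# Hypothetical sketch: pre-generate the easy and hard addition problems once so
# that every participant sees the same problems (ranges and counts are assumed).
import random

def make_problems(n, lo, hi, n_terms, seed):
    rng = random.Random(seed)  # fixed seed -> identical problems for all participants
    problems = []
    for _ in range(n):
        terms = [rng.randint(lo, hi) for _ in range(n_terms)]
        problems.append({"terms": terms, "answer": sum(terms)})
    return problems

easy_problems = make_problems(n=200, lo=1, hi=20, n_terms=2, seed=1)   # easier task
hard_problems = make_problems(n=200, lo=10, hi=99, n_terms=4, seed=2)  # harder task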
Intervention Start Date
2016-10-17
Intervention End Date
2018-08-01

Primary Outcomes

Primary Outcomes (end points)
I measure how participants allocate their time between the two inputs. Based on their performance, I can determine which input is more effective for each participant. I can then see whether the treatment (production uncertainty) induces them to inefficiently switch away from their more effective input.
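A hedged sketch of how this outcome might be computed from the session data follows; the variable names and the simple majority-time rule are illustrative assumptions, not the registered analysis.

# Illustrative only: hypothetical variable names and a simple decision rule for
# flagging a switch away from a participant's better input.
def better_input(easy_rate_control, hard_rate_control):
    """Input with higher earnings per minute in the participant's control session."""
    return "easy" if easy_rate_control >= hard_rate_control else "hard"

def switched_away(minutes_easy_treat, minutes_hard_treat, best):
    """True if most treatment-session time goes to the other (less effective) input."""
    minutes_on_best = minutes_easy_treat if best == "easy" else minutes_hard_treat
    return minutes_on_best < (minutes_easy_treat + minutes_hard_treat) / 2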
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
Each participant completes both a control session and a treatment session (in randomized order). In treatment, the payoffs for each input are randomized. There are two treatment arms: one in which the easy question's payoff has high variance, and one in which the hard question's payoff has high variance.
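The sketch below illustrates what the two arms' payoff draws could look like; the specific piece rates and distributions are hypothetical, as the registration does not list them.

# Hypothetical piece rates: in each arm, one input's per-answer payoff is drawn
# with high variance while the other input's payoff is held fixed.
import random

def draw_payoffs(arm, rng=random):
    if arm == "easy_high_variance":
        easy_rate = rng.choice([0.10, 0.90])   # mean 0.50, high variance
        hard_rate = 1.00                       # fixed
    else:  # "hard_high_variance"
        easy_rate = 0.50                       # fixed
        hard_rate = rng.choice([0.20, 1.80])   # mean 1.00, high variance
    return easy_rate, hard_rate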
Experimental Design Details
Randomization Method
Randomization is done by a computer at the beginning of each laboratory session. Randomization ensures an equal split between treatment arms and between control/treatment session orders. Math problems are randomly generated prior to the experiment so that all participants see the same problems.
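As an assumed illustration of such a scheme (the actual assignment code is not part of the registration), a balanced assignment to the two arms crossed with session order could look like this:

# Illustrative balanced assignment: cycle a shuffled participant list through the
# four arm-by-order cells so each cell receives an equal share.
import random

def assign(participant_ids, seed=0):
    rng = random.Random(seed)
    ids = list(participant_ids)
    rng.shuffle(ids)
    cells = [(arm, order)
             for arm in ("easy_high_variance", "hard_high_variance")
             for order in ("control_first", "treatment_first")]
    return {pid: cells[i % len(cells)] for i, pid in enumerate(ids)}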
Randomization Unit
Randomization is at the individual level within each experimental session. No clustering is required because all participants experience both treatment and control.
Was the treatment clustered?
No

Experiment Characteristics

Sample size: planned number of clusters
60 individuals.
Sample size: planned number of observations
60 individuals.
Sample size (or number of clusters) by treatment arms
30 with treatment Easy High Variance
30 with treatment Hard High Variance
All 60 participants have control and treatment.
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
IRB

Institutional Review Boards (IRBs)

IRB Name
Institutional Review Board for Social and Behavioral Sciences, University of Virginia
IRB Approval Date
2016-10-17
IRB Approval Number
2016-0415

Post-Trial

Post Trial Information

Study Withdrawal


Intervention

Is the intervention completed?
Yes
Intervention Completion Date
August 01, 2018, 12:00 +00:00
Data Collection Complete
Yes
Data Collection Completion Date
October 16, 2017, 12:00 +00:00
Final Sample Size: Number of Clusters (Unit of Randomization)
28 participants, each with both treatment and control
Was attrition correlated with treatment status?
No
Final Sample Size: Total Number of Observations
Final Sample Size (or Number of Clusters) by Treatment Arms
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials