Keep your eyes on the prize: How to make penalty contracts work

Last registered on April 14, 2021

Pre-Trial

Trial Information

General Information

Title
Keep your eyes on the prize: How to make penalty contracts work
RCT ID
AEARCTR-0007520
Initial registration date
April 14, 2021

First published
April 14, 2021, 11:08 AM EDT

Locations

Region

Primary Investigator

Affiliation
University of Bern

Other Primary Investigator(s)

PI Affiliation
University of Bern
PI Affiliation
University of Bern
PI Affiliation
University of Bern

Additional Trial Information

Status
In development
Start date
2020-11-01
End date
2022-01-31
Secondary IDs
Abstract
This online experiment examines the influence of payment modalities on productivity when working under framed incentive contracts. We investigate the effect of payment visualization in making the payment situation more salient to participants. We employ a 2x2 design with four treatments, varying the incentive contract (bonus vs. penalty) and the visualization (visualized vs. description only). Participants work on a real-effort task and have to meet a predefined performance target. If they meet the target, participants in the bonus treatment receive an additional payment. In the penalty treatment, participants receive a higher payment upfront but can lose part of it if they do not meet the target. Both contracts are payoff-equivalent. We vary whether the payment options are visualized on screen with dollar bills and coins, or are only explained using a verbal description.
External Link(s)

Registration Citation

Citation
Essl, Andrea et al. 2021. "Keep your eyes on the prize: How to make penalty contracts work." AEA RCT Registry. April 14. https://doi.org/10.1257/rct.7520-1.0
Experimental Details

Interventions

Intervention(s)
Online experiment on Amazon’s Mechanical Turk (MTurk). 2x2 between-subject design. Participants are randomized at the individual level. In the main stage of the "Bonus" treatments, the payment specifies a base pay plus a bonus if the individual meets a predefined target. In the main stage of the "Penalty" treatments, the payment specifies a higher base pay that includes the incentive pay, which has to be paid back (penalty) if the individual does not meet the target. Both contracts are payoff-equivalent. In the "Visualization" treatments, images of USD bills and coins are shown on screen. In the "Verbal" treatments, the possible outcomes are explained verbally, without visualization.
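Individual-level assignment to the four cells of the 2x2 design could be sketched as follows. This is an illustrative sketch only: the function name, the seed, and the use of balanced (block) randomization are assumptions, not details taken from the registration.

```python
import random

# The four cells of the 2x2 design: contract frame x presentation mode.
ARMS = [
    ("bonus", "visual"),
    ("bonus", "verbal"),
    ("penalty", "visual"),
    ("penalty", "verbal"),
]

def assign_arms(n_participants, seed=None):
    """Balanced individual-level randomization: every block of four
    consecutive participants contains each arm exactly once, in a
    freshly shuffled order."""
    rng = random.Random(seed)
    assignments = []
    while len(assignments) < n_participants:
        block = ARMS[:]          # copy, so ARMS itself is never mutated
        rng.shuffle(block)
        assignments.extend(block)
    return assignments[:n_participants]

# With 1600 advertised slots, balanced blocks yield exactly 400 per arm.
sample = assign_arms(1600, seed=42)
```

Block randomization keeps arm sizes within one participant of each other at every point of recruitment, which matters when attrition can stop the study at an arbitrary count; simple independent draws would also be a defensible choice.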

Intervention Start Date
2021-04-18
Intervention End Date
2021-05-31

Primary Outcomes

Primary Outcomes (end points)
Productivity measured as the number of correct answers in the real-effort task.
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
Success measured by whether the predefined performance target is met (bonus received / penalty not received) or not (bonus not received / penalty received).
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
The experiment consists of three parts and a questionnaire. In Part 1, we assess loss aversion based on a lottery task. In Part 2, participants receive a fixed wage and work on a real-effort encryption task (word encryption task with double randomization, WEDR). Participants have to encode letters into numbers and can proceed to the next word only if they encoded all letters correctly. Effort is measured as the number of solved words. In Part 3, the main stage of the experiment, participants again work on the WEDR task. This time, payment is performance-based and tied to a target. Depending on the treatment, the payment is framed as a bonus or a penalty, and the payment is either visualized or not visualized (see above).
Experimental Design Details
Randomization Method
Computer (online experiment)
Randomization Unit
Individual
Was the treatment clustered?
No

Experiment Characteristics

Sample size: planned number of clusters
We plan for 1440 participants to complete the study. We advertise for 1600 participants to account for individuals excluded under the restrictions below.

We will exclude subjects that:
- do not complete the MTurk study within 45 minutes of starting;
- are not approved for any other reason (e.g., not having a valid MTurk ID);
- do not complete a single task in Part 2;
- do not complete a single task in Part 3.

In pilot data, we had to exclude about 10 percent of participants because they did not meet the restrictions above.

In addition, and in line with previous research, we will analyze the data once excluding those who make inconsistent decisions in the loss aversion test and once including these subjects.

The study will be kept open on MTurk until 1600 subjects have completed the assignment.
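The exclusion rules above amount to a simple per-record filter. A minimal sketch, assuming a flat dictionary per participant; the field names (`completion_minutes`, `approved`, `part2_solved`, `part3_solved`) are hypothetical, since the registration does not specify a dataset schema.

```python
def keep_participant(p):
    """Apply the pre-registered exclusion rules to one record.

    All field names here are illustrative assumptions, not the
    study's actual variable names.
    """
    return (
        p.get("completion_minutes") is not None
        and p["completion_minutes"] <= 45       # finished within 45 minutes
        and p.get("approved", False)            # valid, approved MTurk ID
        and p.get("part2_solved", 0) >= 1       # at least one task in Part 2
        and p.get("part3_solved", 0) >= 1       # at least one task in Part 3
    )

# Toy records: the second takes too long, the third solves nothing in Part 2.
pilot = [
    {"completion_minutes": 30, "approved": True, "part2_solved": 5, "part3_solved": 7},
    {"completion_minutes": 50, "approved": True, "part2_solved": 5, "part3_solved": 7},
    {"completion_minutes": 20, "approved": True, "part2_solved": 0, "part3_solved": 3},
]
kept = [p for p in pilot if keep_participant(p)]  # keeps only the first record
```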
Sample size: planned number of observations
1440 participants
Sample size (or number of clusters) by treatment arms
About 400 participants per treatment (about 360 participants per treatment after excluding participants as described above)
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
Based on a two-sided Wilcoxon-Mann-Whitney test, an error probability of 0.05, and a power of 0.80, we require about 360 participants per treatment to detect an effect of Cohen’s d of 0.21. This is comparable to the treatment effect identified in a pilot study.
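The stated requirement can be checked with a standard normal-approximation power formula, inflating the two-sample t-test sample size by the asymptotic relative efficiency (3/π) of the Wilcoxon-Mann-Whitney test. This is an illustrative cross-check, not the registry's own calculation; the registered figure of about 360 per treatment likely comes from a slightly different approximation (e.g., power software with other corrections).

```python
from math import ceil, pi
from statistics import NormalDist

def wmw_n_per_group(d, alpha=0.05, power=0.80):
    """Approximate per-group n for a two-sided Wilcoxon-Mann-Whitney test.

    Starts from the normal-approximation sample size for a two-sample
    t-test at effect size d, then inflates it by the WMW test's
    asymptotic relative efficiency of 3/pi relative to the t-test.
    """
    z = NormalDist().inv_cdf
    n_ttest = 2 * ((z(1 - alpha / 2) + z(power)) / d) ** 2
    return ceil(n_ttest / (3 / pi))

n = wmw_n_per_group(0.21)  # 373 under this approximation, near the ~360 registered
```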
IRB

Institutional Review Boards (IRBs)

IRB Name
Ethics Committee of the Faculty of Business Administration, Economics and Social Sciences of the University of Bern
IRB Approval Date
2021-02-16
IRB Approval Number
062021

Post-Trial

Post Trial Information

Study Withdrawal

There is information in this trial unavailable to the public.

Intervention

Is the intervention completed?
No
Data Collection Complete
Data Publication

Is public data available?
No

Program Files

Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials