
Keep your eyes on the prize: How to make penalty contracts work 2

Last registered on May 10, 2021

Pre-Trial

Trial Information

General Information

Title
Keep your eyes on the prize: How to make penalty contracts work 2
RCT ID
AEARCTR-0007650
Initial registration date
May 10, 2021

First published
May 10, 2021, 11:45 AM EDT

Locations

Region

Primary Investigator

Affiliation
University of Bern

Other Primary Investigator(s)

PI Affiliation
University of Bern
PI Affiliation
University of Bern

Additional Trial Information

Status
In development
Start date
2021-05-01
End date
2022-06-30
Secondary IDs
Abstract
This online experiment examines the influence of payment modalities on productivity when working under framed incentive contracts. We investigate the effect of payment visualization in making the payment situation more salient to participants. We employ a 2x2 design with four treatments, varying the incentive contract (bonus vs. penalty) and the visualization (visualized vs. description only). Participants work on a real-effort task. At the end, one task is randomly selected for payment. If the selected task is solved correctly, participants in the bonus treatment receive an additional payment. In the penalty treatment, participants receive a higher payment upfront but can lose part of it if the selected task is solved incorrectly. Both contracts are payoff-equivalent. We vary whether the payment options are visualized on screen with dollar bills and coins or are only explained in a verbal description.
External Link(s)

Registration Citation

Citation
Essl, Andrea, Stefanie Jaussi and Frauke von Bieberstein. 2021. "Keep your eyes on the prize: How to make penalty contracts work 2." AEA RCT Registry. May 10. https://doi.org/10.1257/rct.7650-1.0
Experimental Details

Interventions

Intervention(s)
Online experiment on Amazon’s Mechanical Turk (MTurk), with a 2x2 between-subject design and randomization at the individual level. In the main stage of the Bonus treatments, the contract specifies a base pay plus a bonus if the individual solves the randomly selected task correctly. In the main stage of the Penalty treatments, the contract specifies a higher base pay that includes the incentive pay, which has to be paid back (penalty) if the individual solves the selected task incorrectly. Both contracts are payoff-equivalent. In the Visualization treatments, images of USD bills and coins are shown on screen; in the Verbal treatments, the possible outcomes are explained verbally without visualization.
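The payoff equivalence of the two frames can be sketched with hypothetical stakes; the actual MTurk amounts are not stated in this record:

```python
# Hypothetical stakes for illustration; the actual MTurk amounts
# are not stated in this registration.
BASE = 1.00    # base pay in the Bonus frame (USD)
STAKE = 0.50   # bonus paid / penalty withheld for the selected task

def bonus_payoff(correct: bool) -> float:
    """Bonus frame: base pay plus a bonus if the selected task is correct."""
    return BASE + (STAKE if correct else 0.0)

def penalty_payoff(correct: bool) -> float:
    """Penalty frame: higher upfront pay; the stake is paid back
    if the selected task is solved incorrectly."""
    upfront = BASE + STAKE
    return upfront - (0.0 if correct else STAKE)

# Both contracts are payoff-equivalent, outcome by outcome.
for correct in (True, False):
    assert bonus_payoff(correct) == penalty_payoff(correct)
```

Only the framing differs: the same final payments are reached either by adding the stake after success or by subtracting it after failure.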
Intervention (Hidden)
Intervention Start Date
2021-05-17
Intervention End Date
2021-06-30

Primary Outcomes

Primary Outcomes (end points)
Productivity measured as the number of correct answers in the real-effort task.
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
The experiment consists of three parts and a questionnaire. In Part 1, we assess loss aversion based on a lottery task. In Part 2, participants receive a fixed wage and work on a real-effort encryption task (Word Encryption task with Double Randomization, WEDR). Participants have to encode letters into numbers and can proceed to the next word only if they have encoded all letters correctly. Effort is measured as the number of solved words. In Part 3, the main stage of the experiment, participants again work on the WEDR task. This time, payment is performance-based and tied to a randomly selected task. Depending on the treatment, the payment is framed as a bonus or a penalty, and the payment is either visualized or not visualized (see above).
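The task mechanic described above can be sketched as follows. The letter-to-number code here is hypothetical; in a WEDR-style task a fresh code is drawn repeatedly, so memorization does not help and output tracks effort:

```python
import random
import string

def new_code(rng: random.Random) -> dict:
    """Draw a fresh letter-to-number code (hypothetical 3-digit codes);
    re-randomizing the code keeps the task effort-based."""
    numbers = rng.sample(range(100, 1000), 26)
    return dict(zip(string.ascii_uppercase, numbers))

def word_solved(word: str, answer: list, code: dict) -> bool:
    """A word counts as solved only if every letter is encoded correctly;
    participants proceed to the next word only then."""
    return [code[ch] for ch in word] == answer

rng = random.Random(0)
code = new_code(rng)
word = "PRIZE"
assert word_solved(word, [code[c] for c in word], code)   # all letters correct
assert not word_solved(word, [0] * len(word), code)       # any error blocks progress
```

Productivity, the primary outcome, would then simply be the count of words for which `word_solved` returned true.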
Experimental Design Details
Randomization Method
computer (online experiment)
Randomization Unit
individual
Was the treatment clustered?
No

Experiment Characteristics

Sample size: planned number of clusters
The number of participants planned for the experiment is 1440 people completing the study. We advertise for 1600 participants to take into account individuals excluded due to the restrictions below.

We will exclude subjects who do not complete the MTurk study within 45 minutes of starting, or who are not approved for any other reason (e.g., not having a valid MTurk ID).

We will analyze the data both for all participants and for the subset of participants who solve at least one task correctly.

In addition, and in line with previous research, we will analyze the data once excluding subjects who make inconsistent decisions in the loss aversion test and once including them.

The study will be kept open on MTurk until 1600 subjects have completed the assignment.
Sample size: planned number of observations
The number of participants planned for the experiment is 1440 people completing the study. We advertise for 1600 participants to take into account individuals excluded due to the restrictions below.
Sample size (or number of clusters) by treatment arms
About 400 participants per treatment (about 360 per treatment after the exclusions described above)
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
Based on a two-sided Wilcoxon-Mann-Whitney test, an error probability of 0.05, and a power of 0.80, we require about 360 participants per treatment to detect an effect size of Cohen’s d = 0.21. This is comparable to the treatment effect identified in a pilot study.
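The stated requirement can be approximately reproduced with the standard normal-approximation sample-size formula for two independent groups; the Wilcoxon-Mann-Whitney adjustment shown divides by the Pitman asymptotic relative efficiency of 3/pi, which assumes normally distributed outcomes (an assumption not stated in the record):

```python
import math
from statistics import NormalDist

def n_per_group(d: float, alpha: float = 0.05, power: float = 0.80) -> float:
    """Normal-approximation sample size per group for a two-sided,
    two-sample comparison of means at effect size Cohen's d:
    n = 2 * ((z_{1-alpha/2} + z_{power}) / d) ** 2."""
    z = NormalDist().inv_cdf
    return 2 * ((z(1 - alpha / 2) + z(power)) / d) ** 2

n_t = n_per_group(0.21)       # ~356 per group for a t-test, near the stated ~360
n_wmw = n_t / (3 / math.pi)   # ~373 per group with the 3/pi ARE adjustment
```

The registered figure of about 360 per treatment sits between the plain t-test approximation and the ARE-adjusted Wilcoxon-Mann-Whitney value.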
IRB

Institutional Review Boards (IRBs)

IRB Name
Ethics Committee of the Faculty of Business Administration, Economics and Social Sciences of the University of Bern
IRB Approval Date
2021-02-16
IRB Approval Number
062021

Post-Trial

Post Trial Information

Study Withdrawal

There is information in this trial unavailable to the public.

Intervention

Is the intervention completed?
No
Data Collection Complete
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials