
Contract framing: Keep your eyes on the prize
Last registered on August 18, 2020

Pre-Trial

Trial Information
General Information
Title
Contract framing: Keep your eyes on the prize
RCT ID
AEARCTR-0006299
Initial registration date
August 17, 2020
Last updated
August 18, 2020 2:34 AM EDT
Location(s)

This section is unavailable to the public.
Primary Investigator
Affiliation
University of Bern
Other Primary Investigator(s)
PI Affiliation
University of Bern
PI Affiliation
University of Bern
Additional Trial Information
Status
Ongoing
Start date
2020-06-01
End date
2021-12-31
Secondary IDs
Abstract
This online experiment examines the influence of payment modalities on productivity when working under framed incentive contracts. We investigate whether visualizing the payment makes the payment situation more salient to participants. We employ a 2x2 design with four treatments, varying the incentive contract (bonus vs. penalty) and the visualization (visualized dollar bills vs. description only). Participants work on 15 real-effort tasks. At the end, one task is randomly selected for payment. If the selected task is solved correctly, participants in the bonus treatment receive an additional payment. In the penalty treatment, participants receive a higher payment upfront but can lose part of it if the selected task is solved incorrectly. Both contracts are payoff-equivalent. We vary whether the payment options are visualized on screen with dollar bills while participants work on the tasks or are only explained in a verbal description.
External Link(s)
Registration Citation
Citation
Essl, Andrea, Stefanie Jaussi and Frauke von Bieberstein. 2020. "Contract framing: Keep your eyes on the prize." AEA RCT Registry. August 18. https://doi.org/10.1257/rct.6299-1.1.
Experimental Details
Interventions
Intervention(s)
Online experiment on Amazon’s Mechanical Turk (MTurk); 2x2 between-subjects design. Participants are randomized at the individual level. In the Bonus treatments, the contract specifies a base pay of 1 USD plus a bonus of 1 USD if the individual solves the selected task correctly. In the Penalty treatments, the contract specifies a base pay of 2 USD minus a penalty of 1 USD if the individual solves the selected task incorrectly. In the Visualization treatments, images of USD bills are shown on screen; in the Description treatments, the possible outcomes are explained verbally without visualization.
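The payoff equivalence of the two framed contracts can be sketched as follows. This is an illustrative sketch only (not the authors' code); the amounts are taken from the description above:

```python
# Sketch of the two payoff-equivalent contracts (illustrative only).

def bonus_payoff(solved_correctly: bool) -> float:
    """Bonus frame: base pay of 1 USD plus a 1 USD bonus if the
    randomly selected task is solved correctly."""
    base, bonus = 1.0, 1.0
    return base + (bonus if solved_correctly else 0.0)

def penalty_payoff(solved_correctly: bool) -> float:
    """Penalty frame: base pay of 2 USD minus a 1 USD penalty if the
    randomly selected task is solved incorrectly."""
    base, penalty = 2.0, 1.0
    return base - (0.0 if solved_correctly else penalty)

# Both frames yield identical payoffs for either outcome.
for outcome in (True, False):
    assert bonus_payoff(outcome) == penalty_payoff(outcome)
```

The assertion makes the payoff equivalence explicit: only the framing (gain vs. loss) differs between treatments, not the monetary consequences.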
Intervention Start Date
2020-07-30
Intervention End Date
2020-09-30
Primary Outcomes
Primary Outcomes (end points)
Productivity (i.e., the number of correct answers) in the real-effort task. We will analyze the data both for all participants and for the subset of participants who solve at least one of the 15 tasks correctly.
Primary Outcomes (explanation)
Secondary Outcomes
Secondary Outcomes (end points)
Secondary Outcomes (explanation)
Experimental Design
Experimental Design
Online experiment on Amazon’s Mechanical Turk (MTurk); 2x2 between-subjects design. Participants are randomized at the individual level. In the Bonus treatments, the contract specifies a base pay of 1 USD plus a bonus of 1 USD if the individual solves the selected task correctly. In the Penalty treatments, the contract specifies a base pay of 2 USD minus a penalty of 1 USD if the individual solves the selected task incorrectly. In the Visualization treatments, images of USD bills are shown on screen; in the Description treatments, the possible outcomes are explained verbally without visualization.
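The individual-level assignment to the four cells of the 2x2 design could be sketched as below. This is a hypothetical illustration, not the study's actual randomization code; the arm labels follow the treatment names in the description above:

```python
import itertools
import random

# Hypothetical sketch: assigning ~800 participants individually
# (no clustering) to the four cells of the 2x2 design.
ARMS = [f"{frame}-{display}"
        for frame, display in itertools.product(
            ("Bonus", "Penalty"),
            ("Visualization", "Description"))]

def randomize(participant_ids, seed=42):
    """Independently draw one of the four arms for each participant."""
    rng = random.Random(seed)
    return {pid: rng.choice(ARMS) for pid in participant_ids}
```

With about 800 participants and four equally likely arms, this simple independent draw yields roughly 200 participants per treatment, matching the planned sample sizes below.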
Experimental Design Details
Not available
Randomization Method
Computer (online experiment)
Randomization Unit
individual
Was the treatment clustered?
No
Experiment Characteristics
Sample size: planned number of clusters
In total about 800 participants
Sample size: planned number of observations
In total about 800 participants
Sample size (or number of clusters) by treatment arms
About 200 participants per treatment
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
IRB
INSTITUTIONAL REVIEW BOARDS (IRBs)
IRB Name
IRB Approval Date
IRB Approval Number