
Incentive Contracts, Effort Costs, and Productivity in Teams and Individuals: Experimental Evidence from Computer Programmers

Last registered on March 18, 2019

Pre-Trial

Trial Information

General Information

Title
Incentive Contracts, Effort Costs, and Productivity in Teams and Individuals: Experimental Evidence from Computer Programmers
RCT ID
AEARCTR-0003405
Initial registration date
October 09, 2018

First published
October 11, 2018, 7:16 PM EDT

Last updated
March 18, 2019, 5:20 PM EDT

Locations

Region

Primary Investigator

Affiliation
UC San Diego

Other Primary Investigator(s)

PI Affiliation
Cornell University
PI Affiliation
UC San Diego

Additional Trial Information

Status
Ongoing
Start date
2018-10-19
End date
2019-05-31
Secondary IDs
Abstract
Existing research demonstrates sizable negative effects of high temperatures on labor productivity across skill types. In this study, we use experimental variation in room temperature to examine how the effects of incentive structure and work arrangements vary with the higher costs of effort induced by higher temperatures. Motivated by findings that higher temperatures make coordination more difficult, our first objective is to address how higher temperatures differentially affect team-based production. Second, we test a potential intervention for overcoming the negative productivity impacts of high temperatures: specifically, the effectiveness of bonus rates that reflect the increasing cost of effort over time in high temperatures, in both individual and team-based production.
External Link(s)

Registration Citation

Citation
Garg, Teevrat, Maulik Jagnani and Elizabeth Lyons. 2019. "Incentive Contracts, Effort Costs, and Productivity in Teams and Individuals: Experimental Evidence from Computer Programmers." AEA RCT Registry. March 18. https://doi.org/10.1257/rct.3405-3.0
Former Citation
Garg, Teevrat, Maulik Jagnani and Elizabeth Lyons. 2019. "Incentive Contracts, Effort Costs, and Productivity in Teams and Individuals: Experimental Evidence from Computer Programmers." AEA RCT Registry. March 18. https://www.socialscienceregistry.org/trials/3405/history/43632
Sponsors & Partners

There is information in this trial unavailable to the public.
Experimental Details

Interventions

Intervention(s)
We will study our research questions using computer science students hired to complete a computer programming task. We will randomize the temperature of the room in which they perform the task, the incentive bonuses they are offered, and whether the task is completed individually or in teams.

Our study has 8 treatment groups as summarized below:

1. Individual based production, low room temperature, constant bonus rate
2. Individual based production, low room temperature, increasing bonus rate
3. Individual based production, high room temperature, constant bonus rate
4. Individual based production, high room temperature, increasing bonus rate
5. Team based production, low room temperature, constant bonus rate
6. Team based production, low room temperature, increasing bonus rate
7. Team based production, high room temperature, constant bonus rate
8. Team based production, high room temperature, increasing bonus rate
Intervention Start Date
2018-10-19
Intervention End Date
2018-12-15

Primary Outcomes

Primary Outcomes (end points)
Programmer effort
Programmer performance
Programmer productivity
Primary Outcomes (explanation)
Programmer effort will be measured as the number of keystrokes and mouse clicks, in total and over time.
Programmer performance will be measured as the total number of successfully completed features, and as the total number of successfully completed features weighted by the total number of feature submissions.
Programmer productivity will be measured as the number of features successfully completed divided by the sum of the total number of clicks and characters typed.
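To make these definitions concrete, the following is a minimal sketch of how the three measures could be computed from the totals recorded by the tracking program. The function and field names are hypothetical, and the weighting of performance by submissions is one possible interpretation of the definition above, not the study's exact coding.

def primary_outcomes(keystrokes, clicks, features_passed, feature_submissions):
    """Return effort, performance, and productivity measures for one work unit."""
    effort = keystrokes + clicks                      # total input actions
    performance = features_passed                     # features passing the automated checker
    # one possible weighting: features passed per feature submission
    weighted_performance = (features_passed / feature_submissions
                            if feature_submissions else 0.0)
    # features completed per unit of input effort (clicks + characters typed)
    productivity = features_passed / effort if effort else 0.0
    return {"effort": effort, "performance": performance,
            "weighted_performance": weighted_performance,
            "productivity": productivity}

# Example: 14,000 characters typed, 900 clicks, 3 features passed, 7 submissions
print(primary_outcomes(14_000, 900, 3, 7))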

Secondary Outcomes

Secondary Outcomes (end points)
Programmer self-reported satisfaction with performance
Programmer measure of task difficulty
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
To test our research questions, we will hire undergraduate computer science students enrolled in 4-year college programs in Dhaka to complete a computer-programming job that asks programmers to implement five feature additions to an existing Java-based script. Participants will be given four and a half hours to complete the task. All features can be implemented independently of each other and are designed to be of similar difficulty. Layered onto this program is a tracking tool that allows us to measure effort over time by collecting data on the keystrokes, pixels scrolled, and clicks participants make each minute. The program also includes a tool that automatically checks the functionality of features upon submission, and programmers can submit each feature multiple times until they have confirmation that they have completed it correctly. The program also stores the number of times the programmer submits each feature for evaluation.
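The registration does not include the tracking program's code; purely as an illustration, per-minute effort series of the kind described above could be built from a raw event log as follows. The column names and the use of pandas are assumptions, not the study's implementation.

import pandas as pd

def per_minute_effort(event_log: pd.DataFrame) -> pd.DataFrame:
    """Count keystroke, click, and scroll events in each minute of a session."""
    log = event_log.copy()
    log["minute"] = pd.to_datetime(log["timestamp"]).dt.floor("min")
    return (log.pivot_table(index="minute", columns="event_type",
                            values="timestamp", aggfunc="count", fill_value=0)
               .rename_axis(columns=None))

# Tiny synthetic example log (timestamps and event types are made up)
demo = pd.DataFrame({
    "timestamp": ["2018-10-19 10:00:05", "2018-10-19 10:00:40", "2018-10-19 10:01:10"],
    "event_type": ["keystroke", "click", "keystroke"],
})
print(per_minute_effort(demo))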

We chose the population of Dhaka computer science undergraduate students for our study for several reasons. First, Dhaka experiences very high temperatures in summer; the average maximum temperature in September is 32 degrees Celsius, and the average minimum is not much lower at 26 degrees Celsius. Second, the IT sector contributes significantly to Bangladesh’s economy; in 2017, the IT sector in Bangladesh was estimated to be worth $600 million and to account for about 250,000 jobs. Ensuring that IT worker productivity is maintained as temperatures rise is therefore likely to be important to the continued economic development of the country.

Our randomized controlled trial involves a series of treatments that will be randomly assigned across session-room temperatures. To generate temperature variation in our study, we will randomly set the air conditioner temperature in each room in which participants are working to either 23 or 29 degrees Celsius (73.4 or 84.2 degrees Fahrenheit). We will run two session rooms at a time with 10 participants per room for both the individual sessions and the team sessions. We selected 23 and 29 degrees to generate sufficient temperature variation across participants and to ensure that we are not subjecting participants to uncomfortably high temperatures.

In addition to temperature variation, we will vary how bonuses are allocated across sessions. In half of all sessions, participants will receive a constant bonus rate for each feature successfully implemented, as determined by the automated testing tool. The bonus for each feature will be equal to 9% of their total salary of 13 USD (1,100 BDT), allowing them to earn up to an additional 45% of their salary in bonuses. In the other half of all sessions, participants will receive an increasing bonus rate for each feature successfully implemented. The bonus will start at 3% of salary and increase by 3 percentage points for each feature implemented. Therefore, participants in both bonus structure treatments can earn the same total amount in bonuses. This treatment is intended to establish whether the cost of effort increases faster when temperatures are higher, such that workers benefit more from an increasing than from a constant pay-for-performance rate over time when they are in warmer temperatures.
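As a check on the arithmetic, the two bonus schedules can be written out explicitly. The sketch below assumes the stated salary of 1,100 BDT (about 13 USD) and five features, and is illustrative only.

SALARY_BDT = 1100  # stated total salary (about 13 USD)

def constant_bonus(features_completed):
    """Constant schedule: 9% of salary per successfully implemented feature."""
    return 0.09 * SALARY_BDT * features_completed

def increasing_bonus(features_completed):
    """Increasing schedule: 3% of salary for the first feature, rising by
    3 percentage points per feature (3 + 6 + 9 + 12 + 15 = 45% for all five)."""
    return SALARY_BDT * sum(0.03 * (i + 1) for i in range(features_completed))

for n in range(6):
    print(n, constant_bonus(n), increasing_bonus(n))
# Both schedules pay 45% of salary (495 BDT) for all five features, but the
# increasing schedule pays less for partial completion, back-loading rewards.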

Lastly, we will randomly vary whether the task is to be completed individually or in pairs of programmers. Within the team sessions, pairs will be randomly assigned and will be instructed to work on the task in a pair-programming format in which members take turns being the driver (typing) and the navigator (coming up with solutions). With 10 participants per session room, each session room assigned to the team treatment will have 5 teams. This treatment is intended to establish whether the impacts of higher temperature differ when teamwork is required relative to when it is not. In the team sessions, bonuses will be paid for joint performance, such that if a team successfully implements a feature, both team members receive an equivalent bonus. Both team and individual sessions will be randomly assigned to the different incentive designs.
Experimental Design Details
Randomization Method
Treatment conditions will be randomized across session days using coin flips.
Participants will be randomly assigned to session days using uniform random number draws generated with Stata's gen uniform() command.
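The exact assignment code is not part of this registration; the sketch below only illustrates the two-step procedure in Python (the study itself uses physical coin flips and Stata's gen uniform()), with hypothetical session-day and participant labels. Independent draws per treatment dimension are shown for simplicity; in practice, arms would be balanced across session days.

import random

random.seed(3405)  # seed only for reproducibility of this illustration

# Step 1: treatment conditions assigned to session days by coin flips
session_days = ["day01", "day02", "day03", "day04"]
conditions = {day: {"temperature": random.choice(["cool", "warm"]),
                    "bonus": random.choice(["constant", "increasing"]),
                    "team": random.choice([True, False])}
              for day in session_days}

# Step 2: participants assigned to session days via uniform random draws,
# analogous to generating a uniform variable in Stata and sorting on it
participants = [f"P{i:03d}" for i in range(1, 21)]
order = sorted(participants, key=lambda _: random.uniform(0, 1))
assignment = {p: session_days[i % len(session_days)] for i, p in enumerate(order)}

print(conditions)
print(assignment)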
Randomization Unit
We have two levels of randomization: first, to ensure session types are randomly assigned across outside air temperatures and air quality conditions; and second, to ensure participants are not self-selecting into particular day types or into session days also attended by someone they know.
Randomization units:
1. Experimental sessions
2. Individual participants
Was the treatment clustered?
No

Experiment Characteristics

Sample size: planned number of clusters
N/A
Sample size: planned number of observations
200 work-unit observations (100 individual workers, 100 pairs of workers) or 300 workers in total.
Sample size (or number of clusters) by treatment arms
25 individuals in Individual Production, Constant Bonus, Cool Temp; 25 individuals in Individual Production, Increasing Bonus, Cool Temp; 25 individuals in Individual Production, Constant Bonus, Warm Temp; 25 individuals in Individual Production, Increasing Bonus, Warm Temp; 25 teams in Team Production, Constant Bonus, Cool Temp; 25 teams in Team Production, Increasing Bonus, Cool Temp; 25 teams in Team Production, Constant Bonus, Warm Temp; 25 teams in Team Production, Increasing Bonus, Warm Temp
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
Using data from a pilot we ran to ensure the computer program was working effectively, we observed a mean of 13,999.55 total characters typed among programmers working individually (standard deviation of 8,609.67). With 50 individuals each in the warm and cool rooms, to detect a significant effect 80% of the time we need a mean difference in total characters typed of 4,872, or about a 35% change in effort relative to the baseline mean. The mean number of features completed was 2.21 (standard deviation of 0.713), so to detect a significant effect of temperature on programmer performance 80% of the time, the minimum detectable effect size is 0.404 features.
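These minimum detectable effects can be reproduced approximately with a standard two-sample power calculation. The sketch below assumes a two-sided t-test with 50 participants per temperature arm, alpha of 0.05, and 80% power, using the pilot means and standard deviations reported above; it is illustrative, not the registered power-analysis code.

from statsmodels.stats.power import TTestIndPower

# Detectable effect in standard-deviation units (Cohen's d) for n = 50 per arm
d = TTestIndPower().solve_power(effect_size=None, nobs1=50, alpha=0.05,
                                power=0.80, ratio=1.0, alternative="two-sided")

print(round(d * 8609.67))   # characters typed: roughly the 4,872 stated above
print(round(d * 0.713, 3))  # features completed: roughly the 0.404 stated above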
IRB

Institutional Review Boards (IRBs)

IRB Name
UCSD Human Research Protections Program
IRB Approval Date
2018-09-07
IRB Approval Number
180486

Post-Trial

Post Trial Information

Study Withdrawal

There is information in this trial unavailable to the public.

Intervention

Is the intervention completed?
No
Data Collection Complete
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials