The Effects of Evaluator Type and Incentive Structure on Creativity

Last registered on July 13, 2020

Pre-Trial

Trial Information

General Information

Title
The Effects of Evaluator Type and Incentive Structure on Creativity
RCT ID
AEARCTR-0005443
Initial registration date
February 10, 2020

First published
February 11, 2020, 1:49 PM EST

Last updated
July 13, 2020, 5:52 PM EDT

Locations

Region

Primary Investigator

Affiliation
University of Arizona

Other Primary Investigator(s)

PI Affiliation
Tulane University
PI Affiliation
Tulane University

Additional Trial Information

Status
Completed
Start date
2020-02-10
End date
2020-03-20
Secondary IDs
Abstract
Managers often try to encourage their employees to generate and share creative ideas about how to increase firm performance and improve job satisfaction. Soliciting creative ideas can be difficult for a number of reasons discussed in the literature. Some research suggests that subordinates may be reluctant to convey certain types of creative ideas to their superiors for fear of scrutiny. In addition, there is mixed evidence as to whether incentives drive or curb creativity. If incentives can be used to successfully solicit creative ideas, an open question is how the risk structure of such incentives affects workers' likelihood of generating and sharing their ideas. We will conduct a field experiment in a nationwide education program to determine how the type of evaluator and the proposed incentive structures affect the creativity of ideas shared by employees.
External Link(s)

Registration Citation

Citation
Bol, Jasmijn, Lisa LaViers and Jason Sandvik. 2020. "The Effects of Evaluator Type and Incentive Structure on Creativity." AEA RCT Registry. July 13. https://doi.org/10.1257/rct.5443-1.1
Experimental Details

Interventions

Intervention(s)
Intervention Start Date
2020-02-10
Intervention End Date
2020-03-20

Primary Outcomes

Primary Outcomes (end points)
The number of ideas generated by participants. The average and variance of the novelty and usefulness of each idea. The overall creativity of each idea. Overall participation rates.
Primary Outcomes (explanation)
Examining the number of ideas generated will provide us with an initial proxy for the level of creative effort exerted by participants. The scholarship on creativity defines an idea as creative if it is both novel and useful. The novelty and usefulness of an idea are subjective by nature; corporate executives and other teachers and staff may assess the novelty and usefulness of a given idea differently. The novelty and usefulness (i.e., the creativity) of each idea will therefore be judged by both corporate executives and other staff members/teachers. Finally, while every teacher and staff member will be encouraged to share their ideas, some may choose not to. The decision whether or not to participate may be driven by the exogenously varied parameters of the experiment, so participation rates will be a useful dependent variable in our analysis.

Secondary Outcomes

Secondary Outcomes (end points)
Survey responses that capture employees’ self-reported levels of connectedness, creativity, and risk aversion.
Secondary Outcomes (explanation)
Self-reported measures of connectedness, creativity, and risk aversion are hypothesized to be forward-looking measures of productivity, retention, and willingness to participate in contests.

Experimental Design

Experimental Design
Teachers and staff members will be asked to complete a single survey that will enter each of them into two separate contests. The first contest will prompt participants to share their best idea that will “help [the firm] increase the number of students enrolled into music classes,” and the second contest will prompt participants to share their best idea that will “help [the firm] do something better.” The ideas will be evaluated based on their creativity, which has two components: novelty and usefulness. The parameters of the contests will vary in two ways: (1) who the evaluators of the ideas are and (2) the incentive structure for sharing ideas.

Participants will be told that they will be competing against approximately 50 others. They will be told that their ideas will be evaluated either by a panel of top-level executives or by a panel of fellow teachers/staff members (their peers). In each of the two contests, the participant with the most creative idea will be recognized nationally. In some treatment arms, the participant with the most creative idea will receive a $250 prize. In other treatment arms, the participants with the ten most creative ideas will each receive a $25 prize.

After being told the parameters (who their evaluators will be and whether they are competing for one $250 prize or ten $25 prizes), participants will be prompted to answer the two contest questions. They will then be asked several questions about their tenure with the firm, work experience in education, habits for idea generation, connectedness, creativity, risk aversion, and demographics. All participants who complete the survey will receive a $5 participation fee. Participants will be allowed to take the survey multiple times, but they will receive the $5 participation fee only for their first submission. Ideas from each additional survey submission will still be entered into the contests for national recognition and prize money.

All submitted ideas will be de-identified before individuals from the two evaluator groups read and evaluate their creativity. Corporate executives and other (non-competing) teachers/staff members will use a 100-point scale to judge the novelty, usefulness, and creativity of each idea. The ideas with the highest average creativity scores will determine the winners of each contest (a sketch of this scoring rule appears at the end of this section), and prizes (Visa gift cards) will be awarded to the winners; the $5 participation-fee gift cards will be sent to all participants at the same time. The randomization procedure to place participants into treatment arms is as follows:

• There are three treatment arms: (1) Executive Evaluation with one $250 prize; (2) Executive Evaluation with ten $25 prizes; (3) Peer Evaluation with one $250 prize.
• There are 92 offices clustered in 27 different states/provinces (called regions) in North America.
• We will randomly allocate the 27 regions into one of the three treatment arms, making sure there is reasonable balance in the number of offices and eligible participants in each treatment arm.
• We will allocate individuals into treatment arms at the region level to attenuate the possibility that workers in different treatment arms communicate with one another about the different parameters of their respective contests (a sketch of this allocation appears below).
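
To make the region-level allocation concrete, the following is a minimal sketch in Python. The region names and per-region office/worker counts are hypothetical placeholders (the actual counts come from the firm's rosters), and the balance tolerance `tol` is an illustrative assumption rather than a registered parameter.

import random

# Hypothetical per-region counts: region name -> (offices, eligible workers).
# Placeholders only; the study's real rosters cover 92 offices in 27 regions.
count_rng = random.Random(1)
regions = {f"Region-{i:02d}": (count_rng.randint(1, 8), count_rng.randint(1, 30))
           for i in range(1, 28)}

ARMS = ["Executive evaluation, one $250 prize",
        "Executive evaluation, ten $25 prizes",
        "Peer evaluation, one $250 prize"]

def randomize_regions(regions, arms=ARMS, max_tries=10_000, tol=10, seed=0):
    """Shuffle regions into len(arms) equal-sized groups, re-drawing until
    the eligible-worker totals across arms differ by at most `tol` (assumed)."""
    rng = random.Random(seed)
    names = sorted(regions)
    for _ in range(max_tries):
        rng.shuffle(names)
        groups = [names[i::len(arms)] for i in range(len(arms))]
        workers = [sum(regions[r][1] for r in g) for g in groups]
        if max(workers) - min(workers) <= tol:
            return dict(zip(arms, groups))
    raise RuntimeError("no sufficiently balanced draw found; loosen `tol`")

for arm, group in randomize_regions(regions).items():
    print(arm, "->", sorted(group))

Because randomization is clustered at the region level, every office (and hence every eligible worker) in a region faces the same contest parameters, which is what attenuates cross-arm communication.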

Teachers and staff members will not be required to complete the survey. Participation is voluntary, so the final number of participants/ideas in each treatment arm will vary. Participation rates are nevertheless a useful dependent variable for creative idea generation, as all teachers and staff members at participating locations will be invited to take the survey and told the parameters of the competition that pertain to their office.
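
For completeness, here is a minimal sketch of the winner-determination rule described above, under the assumption that each idea's de-identified ratings are collected into a list of 100-point creativity scores; the idea IDs and scores below are hypothetical.

from statistics import mean

# Hypothetical evaluator ratings: idea ID -> creativity scores on the 100-point scale.
creativity_scores = {
    "idea-A": [81, 74, 90],
    "idea-B": [68, 72, 65],
    "idea-C": [88, 85, 79],
}

def contest_winners(scores, n_prizes):
    """Rank ideas by average creativity score, highest first, and return the
    top `n_prizes` idea IDs (1 in the $250 arms, 10 in the ten-$25-prize arm)."""
    ranked = sorted(scores, key=lambda idea: mean(scores[idea]), reverse=True)
    return ranked[:n_prizes]

print(contest_winners(creativity_scores, 1))  # e.g., the single-$250-prize arms
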
Experimental Design Details
Randomization Method
Randomization done in office by a computer
Randomization Unit
State/Province (which we call "Region")
Was the treatment clustered?
Yes

Experiment Characteristics

Sample size: planned number of clusters
27 Regions
Sample size: planned number of observations
120-360 Ideas
Sample size (or number of clusters) by treatment arms
Regions will be randomized into one of the three treatment arms such that each treatment arm has approximately the same number of regions, offices, and eligible participants. There are 92 offices total, spread across 27 regions. We anticipate that 172 workers will be eligible to participate. Some offices have over 30 people, whereas others contain only a single individual. We estimate that each treatment arm will contain approximately 50-60 eligible workers.
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
Supporting Documents and Materials

Documents

Document Name
Extension into M'turk Setting
Document Type
proposal
Document Description
Due to the onset of COVID-19, our original data collection was disrupted: many of the office locations are in large cities that were severely impacted by the spread of the infection. We hope to restart data collection in the future.

The attached proposal describes how we are extending our experimental design into a different setting, using Amazon Mechanical Turk (MTurk) online labor-market participants in a field experiment. The design closely parallels that of the original pre-registered experiment, but slight modifications have been made to accommodate the new setting. We have received approval from the Tulane IRB to perform this experiment (under the same approval number).
File
Extension into M'turk Setting

MD5: 8ec4c40cc6166003e7f7b97106c2b7d5

SHA1: 3286affb1a111ec82656312a83f7c1da64bf405c

Uploaded At: July 13, 2020

IRB

Institutional Review Boards (IRBs)

IRB Name
Tulane University Social-Behavioral IRB
IRB Approval Date
2020-02-04
IRB Approval Number
2020-011

Post-Trial

Post Trial Information

Study Withdrawal

Some information in this trial is unavailable to the public.

Intervention

Is the intervention completed?
No
Data Collection Complete
Data Publication

Is public data available?
No

Program Files

Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials