
On Rating Scales in Subjective Performance Evaluations – The Effect of a Dummy Category
Last registered on March 16, 2018

Pre-Trial

Trial Information
General Information
Title
On Rating Scales in Subjective Performance Evaluations – The Effect of a Dummy Category
RCT ID
AEARCTR-0002736
Initial registration date
March 16, 2018
Last updated
March 16, 2018 4:44 PM EDT
Location(s)
Region
Primary Investigator
Affiliation
University of Cologne
Other Primary Investigator(s)
PI Affiliation
University of Cologne
PI Affiliation
University of Cologne
PI Affiliation
University of Cologne
Additional Trial Information
Status
In development
Start date
2018-03-19
End date
2018-06-30
Secondary IDs
Abstract
A natural field experiment is conducted to investigate the influence of a dummy evaluation category at the bottom of a feedback scale on effort provision and performance. Subjects work on a real-effort task in two successive periods. A performance-dependent bonus is paid for the first period. Performance is evaluated using three evaluation categories: the first category is awarded to the highest-performing subjects, while the lowest-performing subjects are evaluated with a three. A fixed wage is paid in the second period.

Subjects are randomly assigned to one of two treatments. In treatment ND, no dummy evaluation category is shown, i.e., subjects see only the three actual evaluation categories. In treatment CD, a fourth dummy evaluation category is shown in addition.

We hypothesize that average effort provision and performance in the second period are higher in treatment CD. We expect these effects to be strongest for subjects ranked lowest (evaluated with a three).
External Link(s)
Registration Citation
Citation
Sliwka, Dirk et al. 2018. "On Rating Scales in Subjective Performance Evaluations – The Effect of a Dummy Category." AEA RCT Registry. March 16. https://doi.org/10.1257/rct.2736-1.0.
Former Citation
Sliwka, Dirk et al. 2018. "On Rating Scales in Subjective Performance Evaluations – The Effect of a Dummy Category." AEA RCT Registry. March 16. http://www.socialscienceregistry.org/trials/2736/history/26770.
Experimental Details
Interventions
Intervention(s)
Intervention Start Date
2018-03-19
Intervention End Date
2018-06-30
Primary Outcomes
Primary Outcomes (end points)
The number of cover sheets entered correctly at the individual level (individual performance); the number of cover sheets entered at the individual level (individual effort provision); post-experiment questionnaire data
Primary Outcomes (explanation)
Secondary Outcomes
Secondary Outcomes (end points)
Secondary Outcomes (explanation)
Experimental Design
Experimental Design
The first period is the same across treatments: subjects are asked to work on a real-effort task. Subjects are informed that they will receive a performance-based bonus payment, but no information on the evaluation scale is given.

In the second period, subjects learn about the evaluation scale and receive private feedback on their own first-period evaluation before they can work again. Actual evaluations are based on subjects' relative performance and follow exactly the same procedure in both treatments, so that only categories 1–3 are actually awarded. Subjects are not informed about the specific details of the evaluation procedure, but they learn that about 30% of subjects receive the best grade and 40% the second-best grade. In treatment ND, they are informed that 30% of the evaluations are in category 3. In treatment CD, they learn that 30% of the evaluations are in either category 3 or category 4.
We are varying whether the 4th dummy evaluation category is shown in the evaluation scale in the second period.
Experimental Design Details
A natural field experiment is conducted on Amazon MTurk. Acting as a university department, we ask subjects to update a database of class grades in two successive periods. In each period we provide 200 scanned exam cover sheets, each containing six handwritten grades.

The first period is the same across treatments. After short instructions, a quiz on the task and payment structure needs to be passed. Subjects can then work for 20 minutes. Subjects are informed that a performance-based bonus is paid in addition to a fixed wage. Performance is defined as the number of correctly entered cover sheets. However, no information on the number of feedback categories is given.

After the first period, employees are invited by e-mail to work again. When entering the second period, employees receive private feedback on their first-period performance. Across treatments, performance is evaluated using three evaluation categories. A quiz on the task and payment structure needs to be passed to work in the second period. Working time is restricted to four days. Subjects are paid a fixed wage in the second period. We randomly assign employees to either treatment CD or control group ND, stratifying treatment assignment on first-period performance.
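The stratified assignment described above can be sketched as follows. This is a minimal illustration, not the authors' actual procedure: the function name, block size of two, and seed are all assumptions for the sketch. Subjects are sorted by first-period performance, grouped into blocks, and one subject per block is randomly assigned to each arm, so the two arms stay balanced on baseline performance.

```python
import random

def stratified_assignment(performances, seed=42):
    """Assign subjects to ND/CD, stratified on first-period performance.

    performances: dict mapping subject id -> first-period performance.
    Subjects are sorted by performance and paired into blocks of two;
    within each block one subject is randomly assigned to each treatment.
    """
    rng = random.Random(seed)
    # Sort subject ids by first-period performance (ascending)
    order = sorted(performances, key=performances.get)
    assignment = {}
    for i in range(0, len(order), 2):
        block = order[i:i + 2]
        rng.shuffle(block)
        # First subject in the shuffled block gets CD, the other ND
        for j, subject in enumerate(block):
            assignment[subject] = "CD" if j == 0 else "ND"
    return assignment

# Example: 10 hypothetical subjects with simulated performance scores
rng0 = random.Random(0)
perf = {f"s{i}": rng0.randint(0, 60) for i in range(10)}
groups = stratified_assignment(perf)
```

With an even number of subjects, each performance block contributes exactly one subject to each arm, which is one standard way to implement the 500/500 split registered here.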
Randomization Method
Stratification method
Randomization Unit
individual
Was the treatment clustered?
No
Experiment Characteristics
Sample size: planned number of clusters
1000
Sample size: planned number of observations
1000
Sample size (or number of clusters) by treatment arms
Control (ND): 500 subjects
CD: 500 subjects
Note: There can be slight changes in the number of subjects in each treatment due to selective attrition in the second period. However, we document drop-outs and test whether these are systematic.
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
MDES = 7 [cover sheets entered correctly]; standard deviation = 40 [cover sheets entered correctly]; 15.85%
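The registered MDES is consistent with a standard two-sample power calculation for a comparison of means. A sketch, assuming the conventional parameters of a two-sided 5% test with 80% power (these parameters are not stated in the registration and are an assumption here):

```python
import math

def mdes_two_sample(sd, n_per_arm, z_alpha=1.96, z_beta=0.84):
    """Minimum detectable effect size for a two-sample mean comparison.

    MDES = (z_{1-alpha/2} + z_{1-beta}) * sd * sqrt(2 / n_per_arm),
    where z_alpha = 1.96 (two-sided 5% test) and z_beta = 0.84 (80% power).
    """
    return (z_alpha + z_beta) * sd * math.sqrt(2.0 / n_per_arm)

# sd = 40 cover sheets, 500 subjects per arm, as registered
mdes = mdes_two_sample(sd=40, n_per_arm=500)
# yields roughly 7.1 cover sheets, matching the registered MDES of 7
```

In standard-deviation units this corresponds to a detectable effect of about 7/40 ≈ 0.18 SD.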
IRB
INSTITUTIONAL REVIEW BOARDS (IRBs)
IRB Name
IRB Approval Date
IRB Approval Number
Post-Trial
Post Trial Information
Study Withdrawal
Intervention
Is the intervention completed?
No
Is data collection complete?
Data Publication
Data Publication
Is public data available?
No
Program Files
Program Files
Reports, Papers & Other Materials
Relevant Paper(s)
REPORTS & OTHER MATERIALS