
Pay-for-Performance and Bureaucratic Representation in Public Organization

Last registered on May 23, 2021

Pre-Trial

Trial Information

General Information

Title
Pay-for-Performance and Bureaucratic Representation in Public Organization
RCT ID
AEARCTR-0007632
Initial registration date
May 12, 2021

First published
May 14, 2021, 9:38 AM EDT

Last updated
May 23, 2021, 6:48 AM EDT

Locations

Region

Primary Investigator

Affiliation
University of Copenhagen

Other Primary Investigator(s)

PI Affiliation
American University
PI Affiliation
American University

Additional Trial Information

Status
Completed
Start date
2021-05-15
End date
2021-05-22
Secondary IDs
Abstract
Many public organizations use Pay-for-Performance (PfP) schemes as a means of promoting the performance of public services. Does the widespread use of PfP in the public sector have unintended consequences in terms of job attraction? Does PfP affect the prospects of representative bureaucracies? Using a within-subjects conjoint survey experimental design among a non-probability sample of US residents (n = 1,500), this study examines whether PfP (vs. fixed pay) makes individuals of a particular race, gender, or age (e.g., people of color, women, and older people) less attracted to a public sector job.
External Link(s)

Registration Citation

Citation
Favero, Nathan, Joohyung Park and Mogens Jin Pedersen. 2021. "Pay-for-Performance and Bureaucratic Representation in Public Organization." AEA RCT Registry. May 23. https://doi.org/10.1257/rct.7632-3.0
Experimental Details

Interventions

Intervention(s)
The interventions consist of a within-subject conjoint experiment embedded in an electronic survey.
The design is a paired profiles conjoint in which job arrangement profiles for two jobs—A and B—are presented next to each other in a conjoint table.

The first column of the conjoint table lists a total of eight job attributes. The second and third columns list the job attribute values for jobs A and B, respectively. All job attribute values are assigned at random.

The exact text for the eight job attributes and their respective attribute values appears below. Unless noted otherwise, attribute values are assigned with equal probability for each attribute. (A minimal illustrative sketch of the profile randomization follows the attribute list.)

Attribute 1: “Total pay: Expected pay (including bonuses), compared to similar jobs elsewhere”
Attribute values (3):
• Slightly above average
• About average
• Slightly below average

Attribute 2: “Performance bonuses: How much of the expected pay is bonuses that depend on performance”
Attribute values (4; “fixed salary” assigned with 50% probability, the others with 16.7% probability each):
• A large part of your potential pay (20%)
• A moderate part of your potential pay (10%)
• A small part of your potential pay (5%)
• No performance bonuses; fixed salary

Attribute 3: “Job performance evaluation: How your performance is measured”
Attribute values (4):
• Attendance numbers for Project HOPE events
• Satisfaction surveys of Project HOPE event participants
• Changes in community crime, poverty, and blight
• A supervisor evaluation of your work

Attribute 4: “Current community involvement: Current participation levels for the program”
Attribute values (3):
• Frequent participation
• Moderate participation
• Rare participation

Attribute 5: “Community income: Average income in target community”
Attribute values (3):
• High income
• Average income
• Low income

Attribute 6: “Community demographics: Racial/ethnic makeup of neighborhoods”
Attribute values (4):
• Mostly white
• Mostly African American
• Mostly Hispanic
• Multiracial

Attribute 7: “Overtime work: How often you will work extra evening hours”
Attribute values (3):
• Frequently required
• Occasionally required
• Never required

Attribute 8: “Key job task: Most important job qualification”
Attribute values (4):
• Analysis identifying community needs
• Teamwork with peers and supervisors
• Coordination with community groups and organizations
• Direct interaction with community residents
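
A minimal illustrative sketch of the profile randomization follows. It is not the authors' survey instrument code: the function name and data structures are hypothetical, and only the attribute values and assignment probabilities are taken from the registration above.

```python
import random

# Attribute values as listed in the registration. Values are drawn with equal
# probability unless weights are given; Attribute 2 uses the stated 50% / 16.7% weights.
ATTRIBUTES = {
    "Total pay": (
        ["Slightly above average", "About average", "Slightly below average"], None),
    "Performance bonuses": (
        ["A large part of your potential pay (20%)",
         "A moderate part of your potential pay (10%)",
         "A small part of your potential pay (5%)",
         "No performance bonuses; fixed salary"],
        [1/6, 1/6, 1/6, 1/2]),
    "Job performance evaluation": (
        ["Attendance numbers for Project HOPE events",
         "Satisfaction surveys of Project HOPE event participants",
         "Changes in community crime, poverty, and blight",
         "A supervisor evaluation of your work"], None),
    "Current community involvement": (
        ["Frequent participation", "Moderate participation", "Rare participation"], None),
    "Community income": (
        ["High income", "Average income", "Low income"], None),
    "Community demographics": (
        ["Mostly white", "Mostly African American", "Mostly Hispanic", "Multiracial"], None),
    "Overtime work": (
        ["Frequently required", "Occasionally required", "Never required"], None),
    "Key job task": (
        ["Analysis identifying community needs",
         "Teamwork with peers and supervisors",
         "Coordination with community groups and organizations",
         "Direct interaction with community residents"], None),
}

def draw_profile():
    """Draw one job profile: one randomly assigned value per attribute."""
    return {name: random.choices(values, weights=weights, k=1)[0]
            for name, (values, weights) in ATTRIBUTES.items()}

# One conjoint table pairs two independently randomized profiles (Job A and Job B);
# each respondent completes three such tables.
job_a, job_b = draw_profile(), draw_profile()
```

Under these weights, the fixed-salary value appears in about half of all randomized profiles in expectation, which is what produces the expected arm sizes reported under ‘Experiment Characteristics’ below.
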
Intervention Start Date
2021-05-15
Intervention End Date
2021-05-22

Primary Outcomes

Primary Outcomes (end points)
Job attraction (self-reported; based on responses to a survey item)
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
The conjoint survey experiment is carried out among a non-probability sample of US residents. Participants are recruited via Prolific, with enrollment limited to individuals with current residence in the US (using Prolific’s prescreening feature). In order to ensure a racially diverse sample of respondents, we will recruit 750 White respondents and 750 non-White respondents. This quota sampling approach is accomplished using Prolific’s prescreening feature. Specifically, two identical studies are created in Prolific, except that one study limits enrollment to only White participants (using Prolific’s demographic category “Ethnicity (Simplified)”) and the other study limits enrollment to non-White participants.

All survey respondents are presented with a job choice task. First, respondents are exposed to an introductory text describing the task and providing basic information about a city government job (as Community Active Worker on a community empowerment program called Project HOPE). Next, respondents are exposed to a paired profiles conjoint in which two specific job arrangement profiles—A and B—for the Community Active Worker job are presented next to each other in a conjoint table.

The first column of the conjoint table lists eight job attributes. The second and third columns list the job attribute values (for those eight job attributes) for jobs A and B, respectively. All job attribute values are assigned at random.

The exact job attributes and job attribute values appear under ‘INTERVENTIONS.’

As our outcome measure, all respondents are asked to indicate their choice between the two job arrangement profiles (“Which of the two jobs would you personally prefer?”). Response options are “Job A” and “Job B.”

Respondents are presented with similar paired profiles conjoints (i.e., involving the same job attributes and random assignment of job attribute values for the Community Active Worker position) two more times. Thus, each respondent will see a total of three pairs of job profiles.


Based on existing theory and research, we derive and test the following hypotheses:

H1: PfP (vs. fixed pay) affects individuals’ job attraction.

Moreover, we theorize that socio-demographic characteristics (race, gender, age) moderate the effects of PfP (vs. fixed pay) on individuals’ job attraction. In particular, we derive and test the following hypotheses:

H2a: PfP (vs. fixed pay) has a negative impact on job attraction for racial minority individuals (vs. racial majority individuals).
H2b: PfP (vs. fixed pay) has a negative impact on job attraction for women (vs. men).
H2c: PfP has a negative impact on job attraction for older individuals (vs. younger individuals).
Experimental Design Details
Randomization Method
Randomization is carried out by computer using simple randomization (i.e., a single sequence of random assignments).
Randomization Unit
The individual survey respondent
Was the treatment clustered?
No

Experiment Characteristics

Sample size: planned number of clusters
1,500 survey respondents (750 White respondents and 750 non-White respondents)
Sample size: planned number of observations
1,500 survey respondents (750 White respondents and 750 non-White respondents)
Sample size (or number of clusters) by treatment arms
Providing exact sample sizes for all potential constellations of job attribute values is less meaningful given our design and research focus. However, our main treatment of interest—PfP—has the following expected sample sizes by treatment arm (each of the 1,500 respondents evaluates three pairs of job profiles, yielding 9,000 randomized job profiles in total):

• “A large part of your potential pay (20%)” = 1,500
• “A moderate part of your potential pay (10%)” = 1,500
• “A small part of your potential pay (5%)” = 1,500
• “No performance bonuses; fixed salary” = 4,500

For our main analyses, however, we operationalize PfP as a binary variable coded 0 for “Fixed salary” (n = 4,500) and 1 for “PfP” (n = 4,500; pooling all three PfP attribute values).
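
The registered analysis plan is uploaded as a separate document (see ‘Analysis Plan’ below), so the estimation strategy itself is not reproduced on this page. Purely as an illustrative sketch, and not the registered analysis, the binary PfP indicator and the forced-choice outcome could be analyzed with a linear probability model on profile-level data, clustering standard errors by respondent and adding interaction terms for the moderators in H2a-H2c. All variable and file names below (chosen, pfp, nonwhite, female, age, respondent_id, conjoint_long.csv) are hypothetical.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format data: one row per job profile shown
# (1,500 respondents x 3 tasks x 2 profiles = 9,000 rows).
#   chosen         1 if the respondent preferred this profile, else 0
#   pfp            1 if the profile includes performance bonuses, 0 if fixed salary
#   nonwhite, female, age   respondent characteristics (moderators)
#   respondent_id  identifier used to cluster standard errors
df = pd.read_csv("conjoint_long.csv")

# H1: effect of PfP (vs. fixed pay) on the probability that a profile is chosen.
m1 = smf.ols("chosen ~ pfp", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["respondent_id"]})

# H2a-H2c: moderation of the PfP effect by race, gender, and age.
m2 = smf.ols("chosen ~ pfp * (nonwhite + female + age)", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["respondent_id"]})

print(m1.summary())
print(m2.summary())
```

Because attribute values are assigned independently at random, indicators for the other seven job attributes could be added to these formulas without changing the interpretation of the PfP coefficient.
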
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
IRB

Institutional Review Boards (IRBs)

IRB Name
Institutional Review Board for Protection of Human Subjects in Research (IRB) at American University
IRB Approval Date
2021-05-11
IRB Approval Number
IRB-2021-351
Analysis Plan

Analysis Plan Documents

Analysis Plan

MD5: 4c433801e1d72ed785d1d1269d588793

SHA1: 0bf5e883b9430ac978c659f299996a296d2d624c

Uploaded At: May 12, 2021

Post-Trial

Post Trial Information

Study Withdrawal

Intervention

Is the intervention completed?
Yes
Intervention Completion Date
May 22, 2021, 12:00 +00:00
Data Collection Complete
Yes
Data Collection Completion Date
May 22, 2021, 12:00 +00:00
Final Sample Size: Number of Clusters (Unit of Randomization)
1,501
Was attrition correlated with treatment status?
No
Final Sample Size: Total Number of Observations
1,501 respondents
Final Sample Size (or Number of Clusters) by Treatment Arms
N/A (conjoint design)
Data Publication

Data Publication

Is public data available?
Yes

Program Files

Program Files
Yes
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials

Description
Journal article
Citation
Pedersen, M. J., Favero, N., & Park, J. (2023). Pay-for-performance, job attraction, and the prospects of bureaucratic representation in public organizations: evidence from a conjoint experiment. Public Management Review. https://doi.org/10.1080/14719037.2023.2245841