Subjective Promotions

Last registered on May 11, 2026

Pre-Trial

Trial Information

General Information

Title
Subjective Promotions
RCT ID
AEARCTR-0018519
Initial registration date
May 07, 2026

Initial registration date is when the trial was registered; it corresponds to when the registration was submitted to the Registry to be reviewed for publication.

First published
May 11, 2026, 9:18 AM EDT

First published corresponds to when the trial was first made public on the Registry after being reviewed.

Locations

There is information in this trial unavailable to the public.

Primary Investigator

Affiliation
University of Fribourg

Other Primary Investigator(s)


Additional Trial Information

Status
In development
Start date
2026-04-29
End date
2027-12-31
Secondary IDs
Prior work
This trial does not extend or rely on any prior RCTs.
Abstract
We consider a rank-order tournament model involving a tournament administrator (employer or principal) and two agents competing for a prize (the promotion), and experimentally test the principal's inferences and the agents' responses. Specifically, one agent has an exogenous advantage, which is known and observable to the principal. The principal's incentives are designed such that it is in their best interest to set a promotion rule that levels the playing field ex post. We will test (i) whether principals promote optimally and, if not, (ii) whether agents anticipate this distortion and adapt their effort accordingly. This pre-registration extends pre-registration #14254.
External Link(s)

Registration Citation

Citation
Gomez Martinez, Francisco, Holger Herz and Christian Zihlmann. 2026. "Subjective Promotions." AEA RCT Registry. May 11. https://doi.org/10.1257/rct.18519-1.0
Experimental Details

Interventions

Intervention(s)
The interventions in the HUMAN condition, in which the principal is played by a subject and the winning rule is determined ex post (no commitment), are:
1) In Baseline (SYM), the agents are fully symmetric.
2) In Treatment (ASYM), the agents are asymmetric.

The interventions in the COMPUTER condition, in which the winning rule is fixed ex ante and known to agents (commitment), are (SYM: symmetric agents, ASYM: asymmetric agents):
1) EO SYM: Symmetric agents, winning rule is set to 0, meaning the agent with the higher work output wins. Equal opportunities.
2) EO ASYM: Asymmetric agents, winning rule is set to 16, meaning the agent with the higher performance, but not necessarily the higher work output, wins. Equal opportunities; affirmative action implemented.
3) UO ASYM: Asymmetric agents, winning rule is set to 0, meaning the agent with the higher work output wins. Unequal opportunities, since the agents are asymmetric and no affirmative action is implemented.
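The winning rules above can be sketched as a handicap on the output difference. A minimal illustration, under assumptions not stated in the registration: agent 1 holds the exogenous advantage, the advantage enters her observable work output directly, it equals 16 (as the equal-opportunities rule of 16 suggests), and noise is Gaussian.

```python
import random

def promotion_winner(effort_1, effort_2, rule, advantage=16, noise_sd=10):
    """Illustrative sketch of the ex-ante winning rule (COMPUTER condition).

    Assumptions (not registered parameters): agent 1's work output is
    effort plus a fixed exogenous advantage plus noise; agent 2's output
    is effort plus noise; the rule is a handicap on the output difference.
    """
    output_1 = effort_1 + advantage + random.gauss(0, noise_sd)
    output_2 = effort_2 + random.gauss(0, noise_sd)
    # Agent 1 wins only if her output exceeds agent 2's by more than `rule`.
    return 1 if output_1 - output_2 > rule else 2

# rule = 0  -> the agent with the higher work output wins (EO SYM / UO ASYM)
# rule = 16 -> the advantage is netted out, so the agent with the higher
#              effort-plus-noise performance wins (EO ASYM)
```

Under these assumptions, a rule of 16 exactly offsets the advantage, which is one way to read "levels the playing field ex post" in the abstract.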
Intervention Start Date
2026-05-01
Intervention End Date
2026-05-31

Primary Outcomes

Primary Outcomes (end points)
HUMAN:
1) Principal's choice of the promotion rule, in which they decide how to assign the high prize as a function of the difference in the agents' outputs (scale: -100 to 100).
2) Agents' beliefs about the principal's choice of the allocation rule (scale: -100 to 100).
3) Agent's effort provision (effort choice 0 to 100).
COMPUTER:
1) Agent's effort provision (effort choice 0 to 100).
Primary Outcomes (explanation)
The outcomes are directly observable and will not be constructed.

Secondary Outcomes

Secondary Outcomes (end points)
The outcomes are directly observable and will not be constructed.
Secondary Outcomes (explanation)
The outcomes are directly observable and will not be constructed.

Experimental Design

Experimental Design
See below.
Experimental Design Details
Not available
Randomization Method
First, participants randomly draw a card that assigns them to a computer/cubicle. This determines whether they are a principal or an agent (in COMPUTER: whether they are agent 1 or agent 2), which condition they are in, and which matching group of 9 participants (cluster) they belong to. HUMAN: in each round, a group of 1 principal and 2 agents is formed randomly by the computer (repeated random re-matching). COMPUTER: in each round, a group of 2 agents is formed randomly by the computer (repeated random re-matching). Finally, the agents' error term (noise) is randomly drawn by the computer in each round.
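The repeated random re-matching in the HUMAN condition can be sketched as follows. The role counts (3 principals, 3 type-1 agents, 3 type-2 agents per matching group of 9) follow from the sample-size section; that each round pairs one principal with exactly one agent of each type is an assumption for illustration.

```python
import random

def rematch(principals, agents_type_1, agents_type_2):
    """Illustrative sketch of one round of random re-matching (HUMAN).

    Assumption: each matching group of 9 holds 3 principals, 3 type-1
    agents, and 3 type-2 agents, re-matched every round into 3 groups
    of 1 principal + 1 agent of each type.
    """
    p = random.sample(principals, len(principals))        # shuffled copies
    a1 = random.sample(agents_type_1, len(agents_type_1))
    a2 = random.sample(agents_type_2, len(agents_type_2))
    return list(zip(p, a1, a2))

groups = rematch(["P1", "P2", "P3"],
                 ["A1a", "A1b", "A1c"],
                 ["A2a", "A2b", "A2c"])
# -> 3 triples, each (principal, type-1 agent, type-2 agent)
```

Because matching stays within the group of 9, observations are independent across matching groups but not within them, which is why standard errors are clustered at the matching-group level.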
Randomization Unit
Technically, randomization happens at the individual level (participants draw a card individually, and treatments are randomized within session). We will cluster standard errors at the matching-group level, since error terms may be correlated within a matching group (and in case within-session randomization is not feasible due to participant numbers).
Was the treatment clustered?
No

Experiment Characteristics

Sample size: planned number of clusters
HUMAN:
Matching groups of 9, which means 216/9 = 24 clusters (matching groups), on which the SEs will be clustered.
COMPUTER:
48 subjects per treatment (matching groups of 6, so 8 clusters).
Sample size: planned number of observations
HUMAN: 216 participants in total: 72 agents of type 1, 72 agents of type 2, 72 principals. COMPUTER: 144 participants in total, 48 per treatment (72 agents of type 1, 72 agents of type 2, no principals).
Sample size (or number of clusters) by treatment arms
HUMAN: We will collect 216 participants in total, in sessions of 18 participants each. This yields 24 matching groups of 9 participants: 72 principals, 72 agents of type 1, and 72 agents of type 2.
There will be 2 treatments, so 216/2 = 108 participants per treatment arm. Of those, in each treatment: 36 principals, 36 agents of type 1, and 36 agents of type 2.

COMPUTER: 144 participants in total, 48 per treatment (72 agents of type 1, 72 agents of type 2, no principals). Matching groups of 6, so 8 clusters per treatment.
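The sample-size arithmetic above can be cross-checked directly; the figure of 3 COMPUTER treatments follows from the three interventions listed earlier.

```python
# Cross-check of the planned sample sizes stated in the registration.
human_n, group_size = 216, 9
assert human_n // group_size == 24   # HUMAN: 24 matching groups (clusters)
assert human_n // 2 == 108           # HUMAN: participants per treatment arm
assert human_n // 3 == 72            # HUMAN: principals / type-1 / type-2 agents

computer_n, computer_group_size, n_treatments = 144, 6, 3
per_treatment = computer_n // n_treatments
assert per_treatment == 48                        # COMPUTER: 48 per treatment
assert per_treatment // computer_group_size == 8  # COMPUTER: 8 clusters per treatment
```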
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
IRB

Institutional Review Boards (IRBs)

IRB Name
IRB (SES) of the University of Fribourg
IRB Approval Date
2024-03-26
IRB Approval Number
No. 2024-03-01