Pledge and Agree: The Path to Cooperation

Last registered on February 21, 2020

Pre-Trial

Trial Information

General Information

Title
Pledge and Agree: The Path to Cooperation
RCT ID
AEARCTR-0005339
Initial registration date
February 20, 2020

Initial registration date is when the trial was registered.

It corresponds to when the registration was submitted to the Registry to be reviewed for publication.

First published
February 21, 2020, 11:55 AM EST

First published corresponds to when the trial was first made public on the Registry after being reviewed.

Locations

Region

Primary Investigator

Affiliation
MIT

Other Primary Investigator(s)

PI Affiliation
State University of New York (SUNY) at Albany
PI Affiliation
Harvard University

Additional Trial Information

Status
In development
Start date
2020-02-26
End date
2020-03-26
Secondary IDs
Abstract
The prisoner’s dilemma game has the deeply discouraging implication that cooperation will be impossible to achieve in one-time encounters. Fortunately, players in the real world have found ways to engage in cooperative endeavors in just such situations. Here we argue that parties can solve this dilemma through a two-step tatonnement process where they first pledge their intentions. A deal is reached if there is reciprocal agreement on the pledges and breaking one’s word is more costly than merely not cooperating.
Our analysis accomplishes two goals. First, using traditional game-theoretic tools, we demonstrate that cooperation emerges naturally as a subgame perfect equilibrium (SPE) in a pledge-and-agree framework; often it is the only SPE, and in other contexts its properties make it the logical choice among SPEs. Second, we employ experiments to test whether parties actually reach cooperative outcomes when allowed to pledge and agree before taking action.
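
To make the equilibrium logic concrete, the following is a minimal Python sketch of a stylized pledge-and-agree game; the payoff symbols T, R, P, S, the outside option O, and the word-breaking cost c are illustrative assumptions of our own, not the registered experimental payoffs.

# Stylized two-stage pledge-and-agree game built on a prisoner's dilemma with
# payoffs T > R > P > S, an outside option O paid when pledges are not
# reciprocally agreed, and a cost c charged for defecting after agreeing.
T, R, P, S = 3.0, 2.0, 1.0, 0.0   # hypothetical temptation, reward, punishment, sucker payoffs
O = 0.5                           # hypothetical outside option if no agreement is reached
c = 1.5                           # hypothetical cost of breaking one's word

def pd_payoff(my_action, other_action):
    # Row player's payoff in the standard one-shot prisoner's dilemma.
    if my_action == "C":
        return R if other_action == "C" else S
    return T if other_action == "C" else P

def stage_two_payoff(my_action, other_action, agreed):
    # Stage-2 payoff: the PD payoff, minus c if the player defects after agreeing.
    pay = pd_payoff(my_action, other_action)
    if agreed and my_action == "D":
        pay -= c
    return pay

# Backward induction on the subgame after mutual agreement: best reply when the
# other party keeps the agreed pledge to cooperate.
best_reply = max(("C", "D"), key=lambda a: stage_two_payoff(a, "C", agreed=True))

# Agreeing (and then cooperating) survives backward induction when keeping one's
# word is the stage-2 best reply and the cooperative payoff beats the outside option.
cooperation_is_spe = best_reply == "C" and stage_two_payoff("C", "C", True) >= O
print(best_reply, cooperation_is_spe)   # prints: C True

With these illustrative numbers, R >= T - c, so breaking one's word is more costly than merely not cooperating, and the cooperative deal holds.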
External Link(s)

Registration Citation

Citation
ARECHAR, ANTONIO, CUICUI CHEN and RICHARD ZECKHAUSER. 2020. "Pledge and Agree: The Path to Cooperation." AEA RCT Registry. February 21. https://doi.org/10.1257/rct.5339-1.0
Experimental Details

Interventions

Intervention(s)
We invite participants on Amazon Mechanical Turk to take part in an incentivized one-shot prisoner's dilemma game with an anonymous other party. The game is displayed either in the pledge-and-agree framework outlined above or in its standard form. Two alternative payoff sets are also employed: in one, non-agreement is superior to cooperating when the other party does not cooperate; in the other, it is inferior. Participants will be randomly assigned to the alternative conditions.
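
As a rough illustration of the two payoff versions (with numbers of our own choosing, not the registered payoffs), the distinction is whether the non-agreement payoff beats the payoff from cooperating while the other party defects:

# Hypothetical numbers only: the two payoff versions differ in whether the
# non-agreement payoff exceeds the payoff from cooperating when the other defects.
S = 0.0                                    # assumed payoff from cooperating alone
outside_option = {"gloomy": -0.5, "attractive": 0.5}
for label, O in outside_option.items():
    relation = "superior" if O > S else "inferior"
    print(f"{label}: non-agreement is {relation} to cooperating when the other does not")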
Intervention Start Date
2020-02-26
Intervention End Date
2020-03-26

Primary Outcomes

Primary Outcomes (end points)
Overall cooperation across conditions.
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
We have a between-subjects design. We invite participants to take part in an incentivized one-shot game with an anonymous other party and randomly assign them to one of ten conditions. Two conditions present a standard prisoner's dilemma scenario and establish our baselines for the two sets of payoffs. The remaining eight conditions are variations of the game that incorporate our pledge-and-agree design and allow us to test its predictions experimentally.
Experimental Design Details
The first condition is a standard prisoner's dilemma. The second condition introduces the pledge-and-agree structure of the game and offers participants a low, or gloomy, outside option in case agreement is not reached. The third condition is similar to the previous one, but the outside option is relatively high, or attractive. The fourth and fifth conditions are similar to the previous two, but include a preliminary, hypothetical stage in which participants state, in strategy-method fashion, their intended actions for each of the possible scenarios in the game. The last five conditions are identical to the previous five, but with expected payoffs multiplied by three.
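
The ten conditions can be thought of as five game variants crossed with two payoff scales. The sketch below enumerates them and assigns a participant at random; the labels and the assignment routine are our own shorthand for illustration, not the registered implementation.

import itertools
import random

# Our own shorthand labels for the five game variants and the two payoff scales.
game_variants = [
    "standard_pd",
    "pledge_gloomy_outside_option",
    "pledge_attractive_outside_option",
    "pledge_gloomy_with_hypothetical_stage",
    "pledge_attractive_with_hypothetical_stage",
]
payoff_scales = [1, 3]   # baseline payoffs, and payoffs multiplied by three

conditions = [
    f"{variant}_x{scale}"
    for scale, variant in itertools.product(payoff_scales, game_variants)
]
assert len(conditions) == 10   # two standard-PD baselines plus eight pledge-and-agree variants

def assign(participant_id, seed="pledge-and-agree"):
    # Computerized random assignment of an individual to one of the ten conditions.
    rng = random.Random(f"{seed}-{participant_id}")
    return rng.choice(conditions)

print(assign(participant_id=42))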
Randomization Method
Randomization is done by a computer.
Randomization Unit
Individual.
Was the treatment clustered?
No

Experiment Characteristics

Sample size: planned number of clusters
1,500 participants.
Sample size: planned number of observations
1,500 participants.
Sample size (or number of clusters) by treatment arms
150 participants in each of the ten conditions.
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
IRB

Institutional Review Boards (IRBs)

IRB Name
MIT COUHES
IRB Approval Date
2018-06-13
IRB Approval Number
1806392996
Analysis Plan

There is information in this trial that is unavailable to the public.

Post-Trial

Post Trial Information

Study Withdrawal

There is information in this trial that is unavailable to the public.

Intervention

Is the intervention completed?
No
Data Collection Complete
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials