
Information Gathering and Contracting Outcomes in Online Procurement
Last registered on July 10, 2020

Pre-Trial

Trial Information
General Information
Title
Information Gathering and Contracting Outcomes in Online Procurement
RCT ID
AEARCTR-0006114
Initial registration date
July 09, 2020
Last updated
July 10, 2020 10:01 AM EDT
Location(s)

This section is unavailable to the public.
Primary Investigator
Affiliation
Harvard Business School
Other Primary Investigator(s)
Additional Trial Information
Status
Ongoing
Start date
2020-06-10
End date
2020-12-24
Secondary IDs
Abstract
Parties in procurement contracting often lack information about the capabilities or actions of their potential partners. Hiring is a special case of labor procurement where bilateral asymmetric information plays an important role in the efficiency of matching talent to appropriate opportunities (Oyer and Schaefer, 2010). Several practices may reduce information asymmetry in hiring and other procurement contexts, but the use and efficacy of these practices are unclear. Two recommendations are: 1) test applicants and stage information gathering, where a buyer observes small portions of work before committing to a larger project, and 2) introduce competition among applicants. This project seeks to understand the effects of these practices on online procurement outcomes. An additional goal is to measure whether informational treatments make buyers more likely to adopt best practices and to understand the determinants of buyers’ use of different practices.
External Link(s)
Registration Citation
Citation
Stanton, Christopher. 2020. "Information Gathering and Contracting Outcomes in Online Procurement." AEA RCT Registry. July 10. https://doi.org/10.1257/rct.6114-1.0.
Experimental Details
Interventions
Intervention(s)
Buyers in a large online market for services procurement will see different prompts that provide either control messages or suggested best practices for hiring. We will measure how these prompts change whether buyers adopt staged hiring, which entails formally defining milestones that are smaller than a full project. We will also measure whether treated buyers change the number of bidders they evaluate. Auxiliary data analysis will seek to understand how these treatments change buyers’ time use during the formal contracting/negotiation phase and subsequent time spent communicating after a bidder begins working.
Intervention Start Date
2020-06-10
Intervention End Date
2020-07-24
Primary Outcomes
Primary Outcomes (end points)
Project fill rates, log project revenue, project success, client retention, number of project milestones, number of freelancers evaluated, number of freelancers offered milestone contracts.
Primary Outcomes (explanation)
Secondary Outcomes
Secondary Outcomes (end points)
Number of freelancers evaluated on future projects, number of milestones on future projects, buyer time use and extent of chat (to measure effort costs), measures of freelancer quality (to measure match composition change). The exact details of these measures are not yet known because we do not yet have access to the data.
Secondary Outcomes (explanation)
Experimental Design
Experimental Design
Please see the Interventions section for randomization details.
Experimental Design Details
Not available
Randomization Method
In office by computer
Randomization Unit
Clustered by recruiter.
Was the treatment clustered?
Yes
Experiment Characteristics
Sample size: planned number of clusters
Treatment will be assigned across 16 individual units (recruiters), with treatment assignment varying each week.
Sample size: planned number of observations
We do not yet have access to the data needed to report this figure, as it depends on the inflow of buyers to the site. We also cannot release the planned sample size without the company's consent under a data use agreement.
Sample size (or number of clusters) by treatment arms
The expected number of clusters per treatment arm (recruiter-weeks) for a 4-week run-time is 21. We cannot disclose the sample size without the company's consent under a data use agreement.
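For illustration only, the sketch below shows one way the recruiter-week cluster assignment described above could be implemented; the platform's actual assignment procedure is not public. The 16 recruiter units and the 4-week run-time come from this registration, while the arm labels, the number of arms, and the independent uniform draws are assumptions made for the sketch.

```python
import random

# Minimal illustrative sketch: the registry does not disclose the platform's
# actual assignment code.  We assume two prompt arms and independent uniform
# draws per recruiter-week cluster; only the 16 recruiters and 4-week run-time
# come from the registration itself.
RECRUITERS = [f"recruiter_{i}" for i in range(1, 17)]   # 16 units (per registry)
WEEKS = [1, 2, 3, 4]                                    # 4-week run-time (per registry)
ARMS = ["control_prompt", "best_practice_prompt"]       # arm labels are assumptions

random.seed(6114)  # seed fixed only to make this sketch reproducible

# Assign treatment at the recruiter-week level (the stated cluster unit).
assignment = {
    (recruiter, week): random.choice(ARMS)
    for recruiter in RECRUITERS
    for week in WEEKS
}

# Preview a few assignments.
for (recruiter, week), arm in sorted(assignment.items())[:5]:
    print(f"week {week}: {recruiter} -> {arm}")
```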
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
IRB
INSTITUTIONAL REVIEW BOARDS (IRBs)
IRB Name
Harvard University
IRB Approval Date
2020-05-19
IRB Approval Number
IRB20-0799
Analysis Plan

There are documents in this trial that are unavailable to the public.