
Self-Verification Mechanisms for Investment Decisions
Last registered on March 01, 2021

Pre-Trial

Trial Information
General Information
Title
Self-Verification Mechanisms for Investment Decisions
RCT ID
AEARCTR-0007061
Initial registration date
January 18, 2021
Last updated
March 01, 2021 3:44 AM EST
Location(s)
Region
Primary Investigator
Affiliation
Bard College Berlin
Other Primary Investigator(s)
PI Affiliation
University of Kiel
PI Affiliation
University of Kiel
Additional Trial Information
Status
Completed
Start date
2021-01-19
End date
2021-02-28
Secondary IDs
Abstract
The situation between investors and startups is typically characterized by asymmetric information, inducing adverse selection and thus suboptimal investment. To mitigate these inefficiencies, investors often engage in costly verification processes referred to as due diligence. At the same time, recent technological innovations can considerably reduce transaction costs. In this project, we propose a novel “startup investment game” experiment and a series of treatments to test the effects of costly verification and self-verification. Depending on the results, a policy conclusion could be to support platforms based on recent self-verification technologies.
External Link(s)
Registration Citation
Citation
Requate, Till, Aurel Stenzel and Israel Waichman. 2021. "Self-Verification Mechanisms for Investment Decisions." AEA RCT Registry. March 01. https://doi.org/10.1257/rct.7061-1.2.
Experimental Details
Interventions
Intervention(s)
Our workhorse is an investment game. We implement five treatments with different verification mechanisms: (i) Baseline, (ii) Costly Noisy Verification, (iii) Costly Verification, (iv) Costly Self-Verification, and (v) Self-Verification.
Intervention Start Date
2021-01-19
Intervention End Date
2021-02-28
Primary Outcomes
Primary Outcomes (end points)
Amount invested in the firm (conditional on success probability)
Primary Outcomes (explanation)
Secondary Outcomes
Secondary Outcomes (end points)
Success probability communicated to the investor
Verification (yes / no)
Secondary Outcomes (explanation)
Experimental Design
Experimental Design
Our workhorse is an investment game. We implement five treatments with different verification mechanisms: Baseline, Costly Noisy Verification, Costly Verification, Costly Self-Verification, and Self-Verification.

The experiment will be conducted online using oTree (Chen et al., 2016) with a participant pool from an economics department's experimental laboratory.
Experimental Design Details
The design is explained in detail in a PDF document attached to this preregistration. The document will become available upon completion of the study.
Randomization Method
All participants are recruited from the same participant pool of an experimental economics lab.
We randomized the experimental sessions using a computer (4-5 sessions per treatment, for a total of 20-25 sessions).
In each session, individual roles (investor, firm) are randomly assigned by the oTree program.
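The session-level randomization described above can be sketched in a few lines of Python. This is an illustrative reconstruction, not the actual oTree code: the function names, the seed, and the fixed session and participant counts are assumptions for the sketch.

```python
import random

# The five treatments named in the registration.
TREATMENTS = [
    "Baseline",
    "Costly Noisy Verification",
    "Costly Verification",
    "Costly Self-Verification",
    "Self-Verification",
]

def assign_sessions(n_per_treatment=4, seed=0):
    """Randomly order sessions so each treatment gets n_per_treatment sessions."""
    rng = random.Random(seed)
    schedule = TREATMENTS * n_per_treatment
    rng.shuffle(schedule)
    return schedule  # treatment for session 1, session 2, ...

def assign_roles(participant_ids, seed=0):
    """Within a session, randomly split participants into investors and firms."""
    rng = random.Random(seed)
    ids = list(participant_ids)
    rng.shuffle(ids)
    half = len(ids) // 2
    return {"investor": ids[:half], "firm": ids[half:]}
```

With 4 sessions per treatment this yields a 20-session schedule, and a 20-participant session splits into 10 investor/firm pairs, matching the cluster structure stated below.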
Randomization Unit
Experimental session
Was the treatment clustered?
Yes
Experiment Characteristics
Sample size: planned number of clusters
Between 4 and 5 experimental sessions per treatment for 5 treatments:
a total of 20-25 experimental sessions.
Sample size: planned number of observations
4 or 5 sessions per treatment for 5 treatments, each session with about 20 participants: thus, a total of 400-500 participants, where each pair of participants is an independent observation. Hence, the number of independent observations is 200-250 (i.e., between 40 and 50 per treatment for 5 treatments).
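The planned sample sizes above follow from simple arithmetic, which the following sketch verifies; the variable names are illustrative, and the 20-participants-per-session figure is the approximate value stated in the registration.

```python
# Worked check of the planned sample sizes: 4-5 sessions per treatment,
# 5 treatments, ~20 participants per session, pairs as independent units.
sessions_per_treatment = (4, 5)       # lower and upper bound
n_treatments = 5
participants_per_session = 20

total_sessions = tuple(s * n_treatments for s in sessions_per_treatment)
total_participants = tuple(s * participants_per_session for s in total_sessions)
independent_obs = tuple(p // 2 for p in total_participants)   # one per pair
obs_per_treatment = tuple(o // n_treatments for o in independent_obs)

print(total_sessions)      # (20, 25)
print(total_participants)  # (400, 500)
print(independent_obs)     # (200, 250)
print(obs_per_treatment)   # (40, 50)
```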
Sample size (or number of clusters) by treatment arms
Baseline treatment with 4-5 sessions (80-100 participants);
Costly Noisy Verification treatment with 4-5 sessions (80-100 participants);
Costly Verification treatment with 4-5 sessions (80-100 participants);
Costly Self-Verification treatment with 4-5 sessions (80-100 participants);
Self-Verification treatment with 4-5 sessions (80-100 participants).
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
Supporting Documents and Materials

There are documents in this trial that are unavailable to the public. Access to this information can be requested through the registry.
IRB
INSTITUTIONAL REVIEW BOARDS (IRBs)
IRB Name
IRB Approval Date
IRB Approval Number
Post-Trial
Post Trial Information
Study Withdrawal
Intervention
Is the intervention completed?
No
Is data collection complete?
Data Publication
Data Publication
Is public data available?
No
Program Files
Program Files
Reports, Papers & Other Materials
Relevant Paper(s)
REPORTS & OTHER MATERIALS