Intervention (Hidden)
How can organizations intervene to foster the objective evaluation of novel ideas? We will examine whether changing investors’ evaluation practices affects real funding decisions by trainee and professional investors evaluating startups in the field, over time.
Everything is subject to change after we collect baseline data on the first two programs (estimated by June 14).
Interventions
1) Connect will prompt consistent inquiry. When interviews are unstructured, evaluators rate women worse than men (Rivera 2012, 2015). In equity investment, investors tend to ask female founders more “prevention” (risk-focused) questions and male founders more “promotion” (growth-focused) questions, a pattern linked to worse funding outcomes for female founders (Kanze et al. 2018).
2) Connect will ask investors to evaluate using demonstrated competence. When orchestras blinded auditions, more women were hired (Goldin & Rouse 2000), perhaps because evaluators focused on the work task rather than on the appearance of candidates (Stephens et al. 2020). In equity investment, our interviews suggest that some investors already focus on demonstrated competence by evaluating “what progress has been made… over time, you can definitely gather a lot of data”.
Setting the criteria in advance of evaluation can result in less biased hiring in other settings, because evaluators are more likely to apply the criteria (Stephens et al. 2020). For example, asking managers to weight their criteria before they evaluate has been shown to reduce retroactive criteria construction and increase hiring of non-gender-normative candidates (Uhlmann & Cohen 2005). To ensure that investors actually evaluate using demonstrated competence, we also ask them to apply predefined criteria.
3) As a third, supporting intervention, Connect will share prior evaluations after the first evaluation period. Organizational transparency, “making relevant, accessible, and accurate… information available,” can help decrease inequity in real hiring outcomes (Castilla 2015: 315).
Setting
We will leverage a real investment setting, Connect. Connect describes itself as the “largest organization in the world supporting impact-driven, seed-stage startups. Since 2009 our team has directly worked with more than 1,100 entrepreneurs in 28 countries, and our affiliated fund, Connect Investments, has invested in 110 startups that have gone on to raise more than $4 billion in follow-on capital.” Connect will run eight programs: four treatment-control pairs, one pair in each of four regions (Africa, India, MENA, and Latin America). Each pair of programs will select at least 20 startups (up to 24 if they wish to accept more), with at least 30% of those startups led by female founders. We will leverage this setting in two ways.
TRAINEE INVESTORS
During the program, Connect will train entrepreneurs to act as trainee investors, making real investments on behalf of the program over three months. Trainee investors will be asked to evaluate the other startups in their program (at least nine) and choose two to receive a $20,000 investment. Each trainee investor will evaluate over four evaluation periods, with the fourth evaluation determining who receives the investment. During the second, third, and fourth evaluation periods, Connect will ask investors to complete due diligence and rank companies. We will have access to 720 funding decisions by trainee investors.
Researchers will randomize trainee investors into treatment and control programs of at least ten startups each. This randomization will be stratified by region, gender, and venture subsector (so that entrepreneurs’ ventures are not competitors).
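As a minimal sketch of how this stratified assignment could work, assuming startups are grouped into region × founder-gender × subsector strata and alternated between arms (the function and field names are illustrative, not Connect’s actual pipeline):

```python
import random
from collections import defaultdict

def assign_arms(startups, seed=2021):
    """Stratified randomization: within each region x founder-gender x
    subsector stratum, shuffle the startups and alternate treatment/control
    so the paired programs stay balanced on all three dimensions."""
    rng = random.Random(seed)
    strata = defaultdict(list)
    for s in startups:
        strata[(s["region"], s["founder_gender"], s["subsector"])].append(s)
    assignment = {}
    for members in strata.values():
        rng.shuffle(members)
        for i, s in enumerate(members):
            assignment[s["id"]] = "treatment" if i % 2 == 0 else "control"
    return assignment
```

Alternating within shuffled strata keeps arm sizes within one startup of each other in every stratum, even when a stratum contains an odd number of ventures.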
Connect will implement all three interventions we designed for the trainee investors.
1) Prompting consistent inquiry: At the end of all four evaluation periods, Connect will ask trainee investors in the control group: “what additional information would you want on this venture?” For the treatment group, Connect will ask: “what additional information would you want on this venture’s potential for growth?” and “what additional information would you want on how this venture will mitigate risks?”
2) Evaluating using demonstrated competence: During the second, third, and fourth evaluation periods, Connect will ask investors to complete due diligence and rank companies. Connect will ask control trainee investors their normal set of evaluation questions (“what is the company’s growth opportunity?” and “what is the company’s investment opportunity?”) across eight categories (e.g., team, value proposition, market, scale). They will use a 4-point scale per category, so overall scores range from 8 to 32.
In the control group, after the final rank, Connect will ask investors which criteria they used, as a mechanism check: “Please think about how you made your decisions and weight the criteria below with percentages of how much weight you placed on each criterion. (Please make sure it adds up to 100%!)” – [Growth opportunity, Investment opportunity, Improvement made during program].
For the treatment group, Connect will add four questions, each on a 4-point scale and together weighted to equal one third of the overall evaluation (see the sketch below): “Since the beginning of the program, how much has this company improved in understanding its path to growth?”; “Since the beginning of the program, how much has this company improved in executing its path to growth?”; “Since the beginning of the program, how much has this company improved in understanding its risks?”; and “Since the beginning of the program, how much has this company improved in executing on risk mitigation?”
For the treatment group, before the second evaluation period (rank 1), Connect will ask investors which criteria they will use: “Please think about how you make your decisions and weight the criteria below with percentages of how much weight you would place on each criterion. (Please make sure it adds up to 100%!)” – [Growth opportunity, Investment opportunity, Improvement made during program].
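A minimal sketch of how these ratings could combine into a composite score, assuming 1–4 ratings, equal weight within each block, and a simple weighted mean (this aggregation rule is our assumption, not Connect’s actual scoring code):

```python
def composite_score(category_ratings, improvement_ratings=None):
    """Control: mean of the eight 1-4 category ratings (raw totals 8-32).
    Treatment: the four improvement questions together carry 1/3 of the
    composite; the eight category ratings carry the remaining 2/3."""
    assert len(category_ratings) == 8
    base = sum(category_ratings) / 8  # mean category rating, 1-4
    if improvement_ratings is None:  # control group
        return base
    assert len(improvement_ratings) == 4
    improvement = sum(improvement_ratings) / 4  # mean improvement rating, 1-4
    return (2 / 3) * base + (1 / 3) * improvement

# Example: a startup rated 3 on every category and 4 on every improvement
# question scores (2/3) * 3 + (1/3) * 4 = 3.33 under treatment.
```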
3) Sharing prior evaluations – TBD (estimated June 30, 2021).
As a supporting intervention, before the third evaluation period, Connect will share with evaluators how rankings differed between evaluators who prioritized improvement and those who prioritized other criteria. Connect managers will then re-emphasize the importance of using improvement when evaluating companies.
Note: We will define this exact treatment based on the data we collect during rank 1, and will add it to this registration after rank 1 (estimated June 30, 2021).
PROFESSIONAL INVESTORS
As a secondary population, we will observe professional investors who participate in Connect’s programming. The program will invite professional investors to meet the startups, and researchers will track their progress through due diligence and any eventual investment decisions for 18 months after the program. We will have access to at least 720 funding decisions by professional investors (8 programs × 30 investors × at least 3 startups each = 720).
1) Prompting consistent inquiry: Professional investors will receive surveys during multiple evaluation periods: before the program (deciding whom to meet), after meeting the startups, and six, twelve, and 18 months after the program.
Connect will ask professional investors (all mentors) in the control group: “what additional information would you want on this venture?” For the treatment group, Connect will ask: “what additional information would you want on this venture’s potential for growth?” and “what additional information would you want on how this venture will mitigate risks?”
Professional investors will be re-randomized into treatment and control groups in every evaluation period. Some professional investors will therefore be treated in more periods than others, creating variation in treatment dosage over time (see the sketch below).
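A minimal sketch of this repeated randomization and the resulting dosage variable, assuming five evaluation periods (matching the five survey waves above) and an even split each period (the names and split rule are illustrative assumptions):

```python
import random

def dosage_by_investor(investor_ids, n_periods=5, seed=2021):
    """Re-randomize investors into treatment/control in every evaluation
    period; the cumulative count of treated periods is each investor's
    treatment dosage."""
    rng = random.Random(seed)
    dosage = {i: 0 for i in investor_ids}
    for _ in range(n_periods):
        treated = set(rng.sample(investor_ids, k=len(investor_ids) // 2))
        for i in treated:
            dosage[i] += 1
    return dosage  # investor id -> number of treated periods, 0 to n_periods
```

Because the draws are independent across periods, investors accumulate different dosages, which supports estimating dose-response effects rather than a single binary contrast.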
2) Evaluating using demonstrated competence – TBD (estimated September 30, 2021).
After the program, researchers will follow up with the investors who met the companies during the program; these investors will receive surveys six, twelve, and 18 months after the program.
Note: We will define this exact treatment based on the data we collect over the program, after the program ends.