Can Innovators Be Created? Experimental Evidence from an Innovation Contest

Last registered on December 21, 2016

Pre-Trial

Trial Information

General Information

Title
Can Innovators Be Created? Experimental Evidence from an Innovation Contest
RCT ID
AEARCTR-0001857
Initial registration date
December 21, 2016


First published
December 21, 2016, 1:45 PM EST


Locations

Primary Investigator

Affiliation
UC San Diego

Other Primary Investigator(s)

PI Affiliation
UC San Diego

Additional Trial Information

Status
Ongoing
Start date
2016-12-09
End date
2017-06-15
Secondary IDs
Abstract
Existing research on how innovation occurs largely assumes that innovativeness is an inherent characteristic of an individual and that people with this innate ability select into jobs that require innovativeness. While we have some understanding of how well-designed incentives can lead innovators to take on riskier and more innovative projects, little is known about what, if anything, can be done to induce those who do not identify themselves as innovators to assume that role. In this project, we are running a natural field experiment within an innovation contest for undergraduate students at a highly ranked engineering department that will allow us to study this question. In particular, we will study whether people who may not self-select into being innovators can be induced to innovate, whether they innovate differently from those who do self-select, and whether they respond differently to confidence-boosting messages.
External Link(s)

Registration Citation

Citation
Lyons, Elizabeth and Joshua Graff Zivin. 2016. "Can Innovators Be Created? Experimental Evidence from an Innovation Contest." AEA RCT Registry. December 21. https://doi.org/10.1257/rct.1857-1.0
Former Citation
Lyons, Elizabeth and Joshua Graff Zivin. 2016. "Can Innovators Be Created? Experimental Evidence from an Innovation Contest." AEA RCT Registry. December 21. https://www.socialscienceregistry.org/trials/1857/history/12768
Experimental Details

Interventions

Intervention(s)
In order to test whether those who select into innovation have different characteristics from those who do not, whether those who do not can be induced to innovate, and whether these two groups respond differently to a managerial intervention aimed at boosting confidence, we are introducing a 2-by-2 experimental design into an innovation contest for engineering and computer science undergraduate students in a top-10 US engineering department.

1. Inducement Treatment: A monetary incentive ($100) for participation in the contest, offered to a randomly selected subset of the undergraduate population who are eligible for the innovation contest but did not sign up for it. These students will have one week to take up the offer, after which the $100 will be given to all contest participants to avoid income effects. This treatment will over-target females, who make up 23% of the study population and whom we expect both to be less likely to select into the contest and to benefit relatively more from the managerial intervention treatment.

2. Managerial Intervention Treatment: Weekly confidence-boosting emails sent to a subset of both the self-selected innovators (those who signed up for the contest without the cash inducement) and the induced innovators.

We therefore have four conditions: (1) not induced, no managerial intervention; (2) induced, no managerial intervention; (3) not induced, managerial intervention; (4) induced, managerial intervention.
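To make the assignment concrete, below is a minimal sketch of how the cross-randomization into these four cells could be implemented. It is illustrative only: the function name, the seed, and the equal split within each recruitment channel are assumptions, not the study's actual assignment code.

```python
import random

random.seed(1857)  # hypothetical fixed seed (echoing the RCT ID) for reproducibility

def assign_conditions(self_selected, induced):
    """Cross-randomize the managerial intervention within each
    recruitment channel, yielding the four cells of the 2-by-2 design."""
    assignments = {}
    for channel, students in (("not induced", self_selected),
                              ("induced", induced)):
        pool = list(students)
        random.shuffle(pool)
        half = len(pool) // 2
        for student in pool[:half]:
            assignments[student] = (channel, "managerial intervention")
        for student in pool[half:]:
            assignments[student] = (channel, "no managerial intervention")
    return assignments

# Example: 150 self-selected and 150 induced participants -> 75 per cell
cells = assign_conditions(range(150), range(150, 300))
```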
Intervention Start Date
2017-02-01
Intervention End Date
2017-05-31

Primary Outcomes

Primary Outcomes (end points)
Signed up for contest, submitted an app by contest deadline (conditional on signing up), quality of submitted app (conditional on signing up)
Primary Outcomes (explanation)
Quality of the submitted app will be assessed by a panel of expert judges, who are also the people who selected the problem the app is supposed to solve. Quality will be judged based on novelty, functionality, user friendliness, and commercial value.

Secondary Outcomes

Secondary Outcomes (end points)
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
To address our research questions, we are implementing an RCT within an innovation contest for a subset of students who are arguably most at risk of entering innovative careers. In particular, we intend to introduce an innovation contest open to all undergraduate engineering and computer science students in a top US engineering department, in which participants will be required to develop a mobile phone or web application that solves a specific problem assigned by the contest. The problem was chosen by technology managers and entrepreneurs in collaboration with the authors.

Students will be invited to enroll in the contest and will have approximately three months to develop their application. To attract contest participants, in cooperation with the Engineering and Computer Science Department, we will advertise the contest through email blasts and weekly departmental digital newsletters, and hold info-sessions. Students will have until a pre-specified date to sign up. The application submission deadline will be set for three months after the sign-up deadline. After the sign-up deadline has passed, we will email a randomly selected subsample of engineering and computer science students who did not sign up to offer them $100 for participation in the contest. We will oversample females to ensure we have a large enough sample of females to analyze their outcomes. Targeted students will have until the end of that week to accept the offer. This incentive is our inducement treatment, which is designed to ‘create’ innovators from a sample that did not self-identify as such. After this second sign-up deadline has passed, participants will receive the problem they will need to solve with their applications. All participants will receive the $100 offered to the induced population to eliminate concerns about income effects.

When signing up for the contest, students will be asked to complete a survey covering their gender, study major, GPA, year of study, and whether they have previously participated in an innovation contest. Although we cannot collect these data from the population of students who do not sign up for the contest, we will have aggregate data on gender and GPA by major in order to determine whether students who participate look different on average from the population they were drawn from.

During the contest, a randomly selected subsample of participants will receive weekly confidence-boosting emails. This treatment will allow us to test whether self-selected innovators and induced innovators respond differently to an important innovation management practice.

Applications submitted by the contest deadline will be evaluated by a panel of expert judges, including those who chose the problem to be solved by the app, who will test the applications and score each on user friendliness, novelty, effectiveness, and commercial value. The first-place winner will be awarded $5,000, second place will win $2,000, and third place will win $1,000.

Based on prior innovation contests run at the school, we expect the initial sign-up to be 150 students. We will then target 1,000 students with the inducement treatment, from which we expect a take-up of about 15%, yielding roughly 150 induced participants and a total sample of approximately 300 students.

Our analysis will consider whether and how induced innovators perform differently than self-selected innovators, and whether the encouragement emails affected these groups differently. We will also analyze whether these treatments affect males and females differently.
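For concreteness, the following is a minimal sketch of the computer randomization with female oversampling described above (and under Randomization Method below). The 50% female target, the dict-based student records, and the function name are hypothetical; the registration does not specify the exact oversampling rate or implementation.

```python
import random

def draw_inducement_sample(non_signups, n_target=1000, female_share=0.5):
    """Draw the inducement-treatment sample from students who did not
    sign up, oversampling females relative to their 23% population
    share. The 50% female target is an illustrative assumption."""
    females = [s for s in non_signups if s["gender"] == "female"]
    others = [s for s in non_signups if s["gender"] != "female"]
    n_female = min(int(n_target * female_share), len(females))
    return (random.sample(females, n_female)
            + random.sample(others, n_target - n_female))

# Example with a synthetic roster of 4,000 non-signups, 23% female
roster = [{"id": i, "gender": "female" if i % 100 < 23 else "male"}
          for i in range(4000)]
offer_group = draw_inducement_sample(roster)  # 500 females, 500 males
```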
Experimental Design Details
Randomization Method
Randomization done by a computer, with females oversampled for the inducement treatment
Randomization Unit
Student
Was the treatment clustered?
No

Experiment Characteristics

Sample size: planned number of clusters
300 students
Sample size: planned number of observations
300 students
Sample size (or number of clusters) by treatment arms
75 students not induced & no managerial intervention, 75 students induced & no managerial intervention, 75 students not induced & managerial intervention, 75 students induced & managerial intervention
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
IRB

Institutional Review Boards (IRBs)

IRB Name
UC San Diego Human Research Protections Program
IRB Approval Date
2016-11-01
IRB Approval Number
Project # 161649

Post-Trial

Post Trial Information

Study Withdrawal


Intervention

Is the intervention completed?
No
Data Collection Complete
Data Publication

Is public data available?
No

Program Files

Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials