
Anonymity and Creativity

Last registered on November 22, 2019

Pre-Trial

Trial Information

General Information

Title
Anonymity and Creativity
RCT ID
AEARCTR-0004484
Initial registration date
November 22, 2019

Initial registration date is when the trial was registered; it corresponds to when the registration was submitted to the Registry to be reviewed for publication.

First published
November 22, 2019, 11:05 AM EST

First published corresponds to when the trial was first made public on the Registry after being reviewed.

Locations

Region

Primary Investigator

Affiliation
University of Southern Denmark

Other Primary Investigator(s)

Additional Trial Information

Status
In development
Start date
2019-11-24
End date
2020-12-31
Secondary IDs
Abstract
Research on the effect of anonymity on individuals' creativity in idea generation sessions is inconclusive: anonymous brainstorming techniques supposedly perform better since they preclude evaluation apprehension; non-anonymous brainstorming techniques supposedly perform better since they reduce free riding. In this project I suggest brainstorming with selective anonymity as a new method (anonymous brainstorming in which the identity of the idea creators is revealed after evaluation, but only for the top-rated ideas). I expect that the proposed method yields more and better ideas than the two traditional methods, as it dampens the inhibitors at work in each of the other methods. I plan to use a randomized controlled trial to check whether this new method is more powerful than the traditional ones in terms of generating idea quantity and idea quality in idea generation sessions.
External Link(s)

Registration Citation

Citation
Schweisfurth, Tim. 2019. "Anonymity and Creativity." AEA RCT Registry. November 22. https://doi.org/10.1257/rct.4484-1.0
Experimental Details

Interventions

Intervention(s)
Intervention Start Date
2019-11-26
Intervention End Date
2020-03-31

Primary Outcomes

Primary Outcomes (end points)
Quantity of ideas
Quality of ideas (rated by raters)

Quality of ideas
Novelty per idea
Use value per idea
Average novelty per participant
Average use value per participant
Most novel idea per participant
Most usable idea per participant
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
Participants will be invited to take part in a brainstorming challenge. This could be either idea development within the firm, external crowdsourcing, a hackathon, or a similar format. The brainstorming will be conducted with a digital tool, such that participants can take part using a computer or a cell phone. Ideally, all participants take part at the same time. I will provide no extrinsic incentives for participation, since these might limit creativity (Amabile, 1996). Please note that even though I use a digital tool to collect ideas (Toubia, 2006), the findings are unlikely to depend on whether the study is performed online or offline: brainstorming methods do not work better or worse merely because they are conducted online rather than offline (Pinsonneault et al., 1999a).
Before the idea creation session starts, individuals will be randomly assigned to one of the three treatments. They will then be informed about the specific brainstorming challenge, which will be designed in conjunction with the program delivery organization. They are also informed that they can submit more than one idea. One example is the brainstorming challenge used in Girotra et al. (2010), where the brainstorming participants were students.
Experimental Design Details
The data collection will be carried out in universities, where students are asked to take part in a brainstorming challenge (campus and teaching improvement).
Randomization Method
Randomization via the built-in randomizer in the survey tool
Randomization Unit
Individuals
Was the treatment clustered?
No
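The randomization scheme above (individual-level assignment, no clustering, three equal arms) can be sketched as a small script. This is an illustrative reimplementation, not the survey tool's actual randomizer; the arm labels are assumed from the abstract's three methods.

```python
import random
from collections import Counter

def assign_arms(participant_ids, arms, seed=42):
    """Randomly assign each participant to one treatment arm.

    Balanced assignment: shuffle the participants with a seeded RNG,
    then deal them round-robin into the arms, so arm sizes differ by
    at most one.
    """
    rng = random.Random(seed)
    ids = list(participant_ids)
    rng.shuffle(ids)
    return {pid: arms[i % len(arms)] for i, pid in enumerate(ids)}

# Hypothetical arm labels based on the three brainstorming methods.
arms = ["anonymous", "non-anonymous", "selective-anonymity"]
assignment = assign_arms(range(225), arms)

# With 225 individuals and 3 arms, each arm receives exactly 75.
print(Counter(assignment.values()))
```

Seeding the randomizer makes the assignment reproducible for auditing, while the round-robin deal guarantees the 75-per-arm split planned in the registration.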

Experiment Characteristics

Sample size: planned number of clusters
225 individuals
Sample size: planned number of observations
225 individuals
Sample size (or number of clusters) by treatment arms
75 individuals per arm (3 arms)
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
IRB

Institutional Review Boards (IRBs)

IRB Name
IRB Approval Date
IRB Approval Number

Post-Trial

Post Trial Information

Study Withdrawal

Some information in this trial is unavailable to the public.

Intervention

Is the intervention completed?
No
Data Collection Complete
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials