Anonymity and Creativity
Last registered on May 04, 2021


Trial Information
General Information
Initial registration date: November 22, 2019
Last updated: May 04, 2021 6:06 AM EDT

Primary Investigator: University of Southern Denmark
Other Primary Investigator(s)
Additional Trial Information
In development
Start date
End date
Secondary IDs
Research on the effect of anonymity on individuals' creativity in idea generation sessions is inconclusive: anonymous brainstorming techniques supposedly perform better because they preclude evaluation apprehension, while non-anonymous brainstorming techniques supposedly perform better because they reduce free riding. In this project I propose brainstorming with selective anonymity as a new method: anonymous brainstorming in which the identity of the idea creators, their ideas, and the ranking are revealed after evaluation, but only for the top-rated ideas. I expect the proposed method to yield more and better ideas than the two traditional methods, as it dampens the inhibitors at work in each of them. I plan to use a randomized controlled trial to test whether this new method outperforms the traditional ones in terms of idea quantity and idea quality in idea generation sessions.
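The selective-anonymity mechanism described above can be sketched in a few lines. The data layout and the `top_k` cutoff are hypothetical illustrations; the registration only states that the top-rated ideas are revealed after evaluation.

```python
def reveal_top_ideas(ranked_ideas, top_k=3):
    """Selective anonymity: after evaluation, reveal creator identity and
    rank only for the top-rated ideas; all other ideas stay anonymous.
    `ranked_ideas` is assumed sorted best-first; `top_k` is hypothetical."""
    result = []
    for rank, idea in enumerate(ranked_ideas, start=1):
        revealed = rank <= top_k
        result.append({
            "idea": idea["text"],
            "rank": rank if revealed else None,       # rank shown only if top-rated
            "creator": idea["creator"] if revealed else None,  # identity likewise
        })
    return result
```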
External Link(s)
Registration Citation
Schweisfurth, Tim. 2021. "Anonymity and Creativity." AEA RCT Registry. May 04. https://doi.org/10.1257/rct.4484-1.1.
Experimental Details
Intervention Start Date
Intervention End Date
Primary Outcomes
Primary Outcomes (end points)
Quantity of ideas
Quality of ideas (rated by raters)

Quality of ideas
Novelty per idea
Use value per idea
Average novelty per participant
Average use value per participant
Most novel idea per participant
Most usable idea per participant
Primary Outcomes (explanation)
Number of ideas: number of ideas suggested by the participant
Novelty per idea: novelty rated on a 1-9 scale by three independent raters
Use value per idea: use value rated on a 1-9 scale by three independent raters
Average novelty per participant: average rated novelty across all of a participant's ideas
Average use value per participant: average rated use value across all of a participant's ideas
Most novel idea per participant: highest novelty rating among a participant's ideas
Most usable idea per participant: highest use value rating among a participant's ideas
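The outcome definitions above amount to simple aggregations over rater scores. A minimal sketch, assuming each idea carries three 1-9 ratings per dimension (all data here is hypothetical):

```python
from statistics import mean

# Hypothetical ratings: each idea has novelty and use-value scores
# from three independent raters on a 1-9 scale.
ideas_by_participant = {
    "P1": [
        {"novelty": [7, 8, 6], "use_value": [5, 4, 6]},
        {"novelty": [3, 4, 2], "use_value": [8, 7, 9]},
    ],
    "P2": [
        {"novelty": [5, 5, 6], "use_value": [6, 6, 7]},
    ],
}

def outcomes(ideas):
    """Compute the per-participant outcome measures defined above."""
    novelty = [mean(i["novelty"]) for i in ideas]    # novelty per idea
    use_val = [mean(i["use_value"]) for i in ideas]  # use value per idea
    return {
        "quantity": len(ideas),           # number of ideas
        "avg_novelty": mean(novelty),     # average novelty per participant
        "avg_use_value": mean(use_val),   # average use value per participant
        "max_novelty": max(novelty),      # most novel idea per participant
        "max_use_value": max(use_val),    # most usable idea per participant
    }

results = {p: outcomes(ideas) for p, ideas in ideas_by_participant.items()}
```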
Secondary Outcomes
Secondary Outcomes (end points)
Secondary Outcomes (explanation)
Experimental Design
Experimental Design
Participants will be invited to take part in a brainstorming challenge. This could be idea development within the firm, external crowdsourcing, a hackathon, or a similar format. The brainstorming will be conducted with a digital tool, so that participants can take part using a computer or a cell phone. Ideally, all participants take part at the same time. I will provide no extrinsic incentives for participation, since these might limit creativity (Amabile, 1996). Note that even though I use a digital tool to collect ideas (Toubia, 2006), the findings are likely to be agnostic to whether the study is performed online or offline: brainstorming methods do not work better or worse merely because they are conducted online rather than offline (Pinsonneault et al., 1999a).
Before the idea creation session starts, individuals will be randomly assigned to one of the three treatments. They will then be informed about the specific brainstorming challenge, which will be designed in conjunction with the program delivery organization. They are also informed that they can submit more than one idea. As an example, see the brainstorming challenge used in Girotra et al. (2010), where the participants were students.
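The assignment step can be sketched as balanced round-robin randomization. The registration only states that a survey-tool randomizer is used, so this is an illustration; the arm names are taken from the abstract and the seed is arbitrary.

```python
import random

def assign_treatments(participant_ids, seed=42):
    """Randomly assign each participant to one of the three brainstorming
    conditions. Shuffling then dealing round-robin yields balanced arms."""
    arms = ["anonymous", "non_anonymous", "selective_anonymity"]
    rng = random.Random(seed)  # fixed seed only for reproducibility here
    ids = list(participant_ids)
    rng.shuffle(ids)
    # Deal the shuffled participants across arms in turn.
    return {pid: arms[i % len(arms)] for i, pid in enumerate(ids)}

assignment = assign_treatments(range(225))  # planned sample size
```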
Experimental Design Details
Not available
Randomization Method
Randomization by randomizer in survey tool

Randomization Unit
Was the treatment clustered?
Experiment Characteristics
Sample size: planned number of clusters
225 individuals
Sample size: planned number of observations
225 individuals
Sample size (or number of clusters) by treatment arms
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
IRB Name
IRB Approval Date
IRB Approval Number