How Research Affects Policy: Evidence from a Demand and Beliefs Experiment

Last registered on June 17, 2019

Pre-Trial

Trial Information

General Information

Title
How Research Affects Policy: Evidence from a Demand and Beliefs Experiment
RCT ID
AEARCTR-0004274
Initial registration date
June 03, 2019

First published
June 17, 2019, 11:16 AM EDT

Locations

Region

Primary Investigator

Affiliation
Harvard University

Other Primary Investigator(s)

PI Affiliation
PUC Rio
PI Affiliation
University of California, Davis
PI Affiliation
Columbia

Additional Trial Information

Status
Completed
Start date
2017-05-15
End date
2019-06-01
Secondary IDs
Abstract
We carry out an experiment with more than 900 municipal officials (mayors, council members, and municipal secretaries) in Brazil to measure their demand for research information and to identify what types of research findings change their beliefs.
External Link(s)

Registration Citation

Citation
Hjort, Jonas et al. 2019. "How Research Affects Policy: Evidence from a Demand and Beliefs Experiment." AEA RCT Registry. June 17. https://doi.org/10.1257/rct.4274-1.0
Former Citation
Hjort, Jonas et al. 2019. "How Research Affects Policy: Evidence from a Demand and Beliefs Experiment." AEA RCT Registry. June 17. https://www.socialscienceregistry.org/trials/4274/history/48243
Experimental Details

Interventions

Intervention(s)
Intervention Start Date
2017-05-15
Intervention End Date
2018-05-24

Primary Outcomes

Primary Outcomes (end points)
Beliefs about policy effectiveness; Willingness-To-Pay for research studies
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
Participants in a series of national and regional conferences of municipalities (attended by mayors and other municipal officials) were invited to complete an interactive tablet survey presenting information on Early-Childhood Development (ECD) programs. We first elicited participants' prior beliefs about the likely effect sizes of an ECD program. We then offered information from randomly selected research studies for purchase, eliciting each participant's Willingness-To-Pay (in lottery tickets for a trip to the US) to learn the studies' estimated effect sizes. After a random price was drawn, the studies' results were revealed (or not, depending on whether the stated Willingness-To-Pay met the price). Finally, we measured posterior beliefs about likely effect sizes and offered for purchase a policy implementation report describing how to implement an ECD program.
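The purchase rule described above (state a Willingness-To-Pay, draw a random price, reveal results only if the price does not exceed the stated amount) resembles a Becker–DeGroot–Marschak mechanism. A minimal sketch, assuming a uniform random integer price in lottery tickets (the registration does not specify the price distribution, currency units, or function names used in the actual survey instrument):

```python
import random

def bdm_reveal(wtp, max_price, rng=random):
    """BDM-style purchase rule (illustrative, not the study's code).

    A random price is drawn uniformly from 0..max_price. The study's
    results are revealed iff the stated willingness-to-pay (in lottery
    tickets) is at least the drawn price; the participant then pays the
    drawn price, not the stated WTP, which makes truthful reporting
    the optimal strategy.
    """
    price = rng.randint(0, max_price)
    revealed = wtp >= price
    tickets_paid = price if revealed else 0
    return price, revealed, tickets_paid
```

Because payment equals the drawn price rather than the stated amount, a participant cannot gain by overstating or understating their true valuation, which is why this mechanism is commonly used for incentive-compatible demand elicitation.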
Experimental Design Details
Randomization Method
Randomization by computer
Randomization Unit
Individual, Individual x Round
Was the treatment clustered?
No

Experiment Characteristics

Sample size: planned number of clusters
900
Sample size: planned number of observations
900 individuals, multiple observations per individual depending on the outcome measure
Sample size (or number of clusters) by treatment arms
We don't have discrete treatment arms as such. Participants are offered multiple randomly-selected studies sequentially in multiple rounds.
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
IRB

Institutional Review Boards (IRBs)

IRB Name
IRB Approval Date
IRB Approval Number

Post-Trial

Post Trial Information

Study Withdrawal

There is information in this trial unavailable to the public.

Intervention

Is the intervention completed?
No
Data Collection Complete
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials