The Role of Evidence in Policy Adoption

Last registered on October 17, 2018

Pre-Trial

Trial Information

General Information

Title
The Role of Evidence in Policy Adoption
RCT ID
AEARCTR-0003324
Initial registration date
September 13, 2018

First published
September 21, 2018, 12:10 AM EDT

Last updated
October 17, 2018, 7:23 PM EDT

Locations

Region

Primary Investigator

Affiliation
Princeton University

Other Primary Investigator(s)

PI Affiliation
Harvard University
PI Affiliation
IDEICE & PUCMM

Additional Trial Information

Status
Ongoing
Start date
2018-01-01
End date
2019-09-10
Secondary IDs
Abstract
This study examines whether providing information on the evidence behind a program increases its adoption by government officials. To test this, we exploit the nationwide scale-up of an education campaign in the Dominican Republic and randomly vary whether school officials receive information on the existing evidence of the impact of the program. A treatment arm with financial incentives acts as a benchmark. Further, we analyze whether technical assistance and additional reminders increase take-up.
External Link(s)

Registration Citation

Citation
Morales, Daniel, Christopher Neilson and Gautam Rao. 2018. "The Role of Evidence in Policy Adoption." AEA RCT Registry. October 17. https://doi.org/10.1257/rct.3324-2.0
Former Citation
Morales, Daniel, Christopher Neilson and Gautam Rao. 2018. "The Role of Evidence in Policy Adoption." AEA RCT Registry. October 17. https://www.socialscienceregistry.org/trials/3324/history/35842
Experimental Details

Interventions

Intervention(s)
Intervention Start Date
2018-09-10
Intervention End Date
2018-09-24

Primary Outcomes

Primary Outcomes (end points)
The primary outcome of this study is whether the school director implemented the AVE program at his school. We will attempt to measure this in two ways. First, we will measure whether the director uploads evidence of completion as required in the program instructions. Second, we will attempt to acquire independent validation of this through student surveys, although the feasibility of doing so is not clear at the time of this registration.
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
The secondary outcomes, which are better thought of as intermediate steps rather than separate outcomes, are whether the director clicked through to the landing page linked in the email, and whether they downloaded the videos.
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
This study is based on the national roll-out of the Aprendiendo el Valor de la Educación (AVE) program. AVE is an at-scale evaluation of interventions that use videos to present accurate and clear information on the potential benefits and costs of schooling to 7th- through 12th-grade students in the Dominican Republic. The program was piloted in 25 percent of all public middle schools in 2015 and was extended to 50 percent of all public middle schools in 2016. Berry, Coffman, Morales, and Neilson (2018) show that AVE significantly increases test scores and reduces drop-out rates in participating schools.

The Ministry of Education of the Dominican Republic has decided to scale up the AVE program in the 2018-19 academic year. The scaled-up version will be similar to the one tested before, but its delivery to schools will change. While the school directors in the pilot were personally trained and incentivized to implement the program, the scale-up will rely on directors following an implementation protocol. All directors in the country will receive an email with links to the videos, and with precise implementation instructions. All school directors will also receive a phone call to confirm that they received the email.

The week of September 20-24 has been designated AVE week, with the aim of showing the AVE videos to every student enrolled in 7th grade or above. The idea is to introduce the program as part of the school’s regular activities.

With the aim of maximizing the number of schools taking up the program, we are interested in evaluating the performance of four interventions on top of the email.

In a first experiment, 50% of the directors will receive information about the results of the AVE pilot. This treatment is expected to encourage directors to implement AVE by highlighting the existing evidence of the impacts of the program. To measure spillover effects for this treatment, districts will be evenly divided into high-share districts, in which 67% of schools are treated, and low-share districts, in which 33% are treated.
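
As an illustration of this two-level design, the sketch below (in Python, with made-up district and school identifiers) first splits districts evenly into high-share and low-share groups and then draws the information treatment for 67% or 33% of the schools within each district, so that about 50% of schools are treated overall. The actual assignment code used by the research team is not part of this registration.

# Sketch of the two-level randomization for the information treatment:
# districts are split evenly into high-share (67%) and low-share (33%)
# groups, and schools are then drawn within each district at that share.
# District and school identifiers are made up for illustration.
import random

random.seed(3324)  # arbitrary seed, for reproducibility only

districts = {f"district_{d}": [f"school_{d}_{s}" for s in range(25)]
             for d in range(10)}

# Step 1: evenly divide districts into high-share and low-share groups.
district_ids = list(districts)
random.shuffle(district_ids)
half = len(district_ids) // 2
share = {d: 0.67 for d in district_ids[:half]}
share.update({d: 0.33 for d in district_ids[half:]})

# Step 2: within each district, treat that district's share of schools,
# so that roughly 50% of all schools receive the information treatment.
info_treatment = {}
for d, schools in districts.items():
    treated = set(random.sample(schools, round(share[d] * len(schools))))
    for s in schools:
        info_treatment[s] = s in treated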

In a second experiment, 18.75% of the directors will receive a financial incentive in the form of a lottery. This treatment is expected to serve as a benchmark and comprises the random distribution of 50 tablets among directors who show the videos to the students and upload a photo of the event to the AVE website (as specified in the instructions in the email).

In a third experiment, 25% of the directors will receive additional reminders from a call center organized by the Ministry of Education. This includes a second email and a second call two weeks after the initial email was sent out. Finally, if the director has not downloaded the videos by September 24, he will receive a third email and a third call on September 25, reminding him to show the videos. This intervention is denoted as “high-intensity”.
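
The follow-up rule for this arm can be made explicit in a short sketch; the function and argument names below are hypothetical and simply restate the schedule described above.

# Sketch of the reminder schedule for the high-intensity arm. The function
# and argument names are hypothetical; the dates restate the rules above.
from datetime import date, timedelta

def reminder_dates(initial_email_date, download_date=None):
    """Return the follow-up contact dates (email + call) for one school."""
    contacts = [initial_email_date + timedelta(weeks=2)]  # second email and call
    # A third email and call go out on September 25 only if the videos were
    # not downloaded by September 24.
    if download_date is None or download_date > date(2018, 9, 24):
        contacts.append(date(2018, 9, 25))
    return contacts

print(reminder_dates(date(2018, 9, 10)))                      # never downloaded
print(reminder_dates(date(2018, 9, 10), date(2018, 9, 20)))   # downloaded in time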

In a fourth experiment, 50% of the directors will receive information on the assistance that the ministry provides to help schools implement the program. This includes the telephone number for a helpline and a link to a website with further information.

The four treatments will be fully cross-randomized, with the single exception that the high-intensity group will never receive the financial incentive.
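
A minimal sketch of such an assignment is given below, with hypothetical school identifiers and the treatment probabilities stated above; for simplicity it treats the information arm as a school-level coin flip rather than the district-share design of the first experiment, and it is not the randomization code actually used by the team.

# Sketch of the cross-randomization described above. Treatment probabilities
# come from the stated shares; drawing the financial incentive and the
# high-intensity arm from a single uniform draw guarantees they never coincide.
import random

random.seed(3324)  # arbitrary seed, for reproducibility only

def assign_treatments(school_ids):
    assignments = {}
    for school in school_ids:
        info = random.random() < 0.50        # information on evidence
        assistance = random.random() < 0.50  # implementation assistance
        u = random.random()                  # one draw for the exclusive arms
        incentive = u < 0.1875               # financial incentive (lottery)
        high_intensity = 0.1875 <= u < 0.4375  # additional reminders
        assignments[school] = {
            "info": info,
            "assistance": assistance,
            "incentive": incentive,
            "high_intensity": high_intensity,
        }
    return assignments

treatments = assign_treatments([f"school_{i}" for i in range(2724)])
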
Experimental Design Details
Randomization Method
Randomization is done in office by a computer.
Randomization Unit
School
Was the treatment clustered?
No

Experiment Characteristics

Sample size: planned number of clusters
2,724
Sample size: planned number of observations
2,724 schools
Sample size (or number of clusters) by treatment arms
Cell 1: letter only = 380 schools
Cell 2: letter + information on evidence = 386 schools
Cell 3: letter + financial incentive = 123 schools
Cell 4: letter + high intensity = 174 schools
Cell 5: letter + assistance = 391 schools
Cell 6: letter + information on evidence + financial incentive = 126 schools
Cell 7: letter + information on evidence + high intensity = 172 schools
Cell 8: letter + information on evidence + assistance = 378 schools
Cell 9: letter + financial incentive + assistance = 129 schools
Cell 10: letter + high intensity + assistance = 168 schools
Cell 11: letter + information on evidence + financial incentive + assistance = 128 schools
Cell 12: letter + information on evidence + high intensity + assistance = 169 schools
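
As an arithmetic check, the cell sizes above sum to 2,724 schools and reproduce the planned marginal shares (information 50%, financial incentive 18.75%, high intensity 25%, assistance 50%); the short sketch below, with cells keyed by the treatments they contain, performs this check.

# Check that the planned cell sizes sum to 2,724 schools and reproduce the
# marginal treatment shares stated in the design.
cells = {
    ("letter",): 380,
    ("letter", "info"): 386,
    ("letter", "incentive"): 123,
    ("letter", "high"): 174,
    ("letter", "assist"): 391,
    ("letter", "info", "incentive"): 126,
    ("letter", "info", "high"): 172,
    ("letter", "info", "assist"): 378,
    ("letter", "incentive", "assist"): 129,
    ("letter", "high", "assist"): 168,
    ("letter", "info", "incentive", "assist"): 128,
    ("letter", "info", "high", "assist"): 169,
}

total = sum(cells.values())
print(total)  # 2724

for arm in ("info", "incentive", "high", "assist"):
    n = sum(size for treatments, size in cells.items() if arm in treatments)
    print(arm, n, round(n / total, 4))
# info 1359 (~0.50), incentive 506 (~0.1875), high 683 (~0.25), assist 1363 (~0.50)
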
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
IRB

Institutional Review Boards (IRBs)

IRB Name
Institutional Review Board, Princeton University
IRB Approval Date
2018-09-03
IRB Approval Number
10976
Analysis Plan

There is information in this trial unavailable to the public.

Post-Trial

Post Trial Information

Study Withdrawal

There is information in this trial unavailable to the public.

Intervention

Is the intervention completed?
No
Data Collection Complete
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials