
Promoting an evidence-based culture: an experiment with Latin-American policymakers
Last registered on July 15, 2018

Pre-Trial

Trial Information
General Information
Title
Promoting an evidence-based culture: an experiment with Latin-American policymakers
RCT ID
AEARCTR-0003136
Initial registration date
July 09, 2018
Last updated
July 15, 2018 12:28 PM EDT
Location(s)
Primary Investigator
Affiliation
CAF - Development Bank of Latin America
Other Primary Investigator(s)
PI Affiliation
CAF - Development Bank of Latin America
PI Affiliation
Universidad del Rosario
Additional Trial Information
Status
Ongoing
Start date
2017-03-22
End date
2019-02-01
Secondary IDs
Abstract
This study aims to identify the effects of a two-day face-to-face training program, the Seminar of Impact Evaluation for Development (SEMIDE), which promotes impact evaluation (IE) as a tool for enhancing public management in Latin America. The seminar was designed and implemented by the Impact Evaluation and Policy Learning Department of CAF - Development Bank of Latin America - and targets Latin American policymakers with the potential to promote institutional changes within their institutions. Participants are selected through a competitive process that ends with a score for each candidate, followed by a randomized allocation of the 35 places among the top 100 candidates. Each seminar day is divided into two sections: 1) presentations of impact evaluation cases by both academic experts and policymakers, and 2) practical sessions where participants learn and apply IE knowledge. We intend to evaluate whether attending the seminar had a positive effect on participants in terms of the promotion and implementation of IEs inside their institutions, their perceptions of the usefulness of IE, long-lasting knowledge of IE concepts, and other variables. Two editions took place in 2017, in Peru (May 2017) and Argentina (November 2017). The first stage of this study focuses on the Peruvian edition, one year after it took place. We ultimately aim to conduct a multisite experiment that combines findings from both editions and potentially includes other editions expected to occur during the next two years (most likely in Ecuador and Colombia).
Registration Citation
Citation
CAF, Pilar, Jorge Gallego and Daniel Ortega. 2018. "Promoting an evidence-based culture: an experiment with Latin-American policymakers." AEA RCT Registry. July 15. https://doi.org/10.1257/rct.3136-1.0.
Former Citation
CAF, Pilar et al. 2018. "Promoting an evidence-based culture: an experiment with Latin-American policymakers." AEA RCT Registry. July 15. http://www.socialscienceregistry.org/trials/3136/history/31807.
Sponsors & Partners

There are documents in this trial that are unavailable to the public; access can be requested through the registry.
Experimental Details
Interventions
Intervention(s)
The intervention consisted of attending a two-day face-to-face seminar that discussed how impact evaluation (IE) can be a useful tool for policymakers to enhance public management in their institutions. Each seminar day was divided into two sections. In the first, invited academics and policymakers delivered presentations covering basic IE concepts and methods and/or personal cases in which IE helped them resolve a particular public policy issue and improve decision making. The second section consisted of workshops where participants learned how to identify the key elements for assessing the feasibility of a potential IE in their own projects, clarified their doubts about IE, and worked on an IE proposal that was presented to their peers.
Intervention Start Date
2017-05-25
Intervention End Date
2017-11-24
Primary Outcomes
Primary Outcomes (end points)
-IE knowledge score on survey test
-Use of evidence generated by IE
-Perception on the importance of IE
-Knowledge and use of evidence websites
-Engagement level on different stages of an IE process
-Number of IEs implemented/supported
Primary Outcomes (explanation)
Secondary Outcomes
Secondary Outcomes (end points)
Secondary Outcomes (explanation)
Experimental Design
Experimental Design
For Peru's edition, each applicant received a score after the three-step application process for attending the seminar (online application assessment, video recording, and interview). The top 100 applicants were randomly assigned either to attend the seminar or to the control group (treatment = 35, control = 65), following a blocking strategy based on the score distribution. Three strata were defined: 1) positions 1 to 33, 2) positions 34 to 90, and 3) positions 91 to 100. Participants were then randomly selected within each block, based on the weight of each stratum in the final sample. For Argentina's edition, we conducted a simple randomization among the top 81 applicants, as their scores after the three-step application process were highly homogeneous (for this edition, the three steps were an online application assessment, an interview, and the score on an IE online course). The randomization design for the potential Ecuador and Colombia SEMIDEs will be detailed at a later stage of this multisite experiment.
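The blocked assignment for Peru's edition can be sketched as follows. This is a minimal illustration, not the study's actual code: the proportional slot allocation with largest-remainder rounding and the random seed are assumptions; the registry only states that treated slots were allocated within strata according to each stratum's weight.

```python
import random

def stratified_assignment(n_applicants=100, n_treated=35,
                          strata=((1, 33), (34, 90), (91, 100)), seed=0):
    """Randomly assign treatment within rank-based strata, splitting the
    treated slots across strata in proportion to stratum size."""
    rng = random.Random(seed)
    sizes = [hi - lo + 1 for lo, hi in strata]
    # Proportional allocation of treated slots, largest-remainder rounding
    exact = [n_treated * s / n_applicants for s in sizes]
    alloc = [int(e) for e in exact]
    by_remainder = sorted(range(len(strata)),
                          key=lambda i: exact[i] - alloc[i], reverse=True)
    for i in by_remainder[: n_treated - sum(alloc)]:
        alloc[i] += 1
    # Draw treated applicants uniformly at random within each stratum
    assignment = {}
    for (lo, hi), k in zip(strata, alloc):
        ranks = list(range(lo, hi + 1))
        treated = set(rng.sample(ranks, k))
        for r in ranks:
            assignment[r] = "treatment" if r in treated else "control"
    return assignment

assignment = stratified_assignment()
```

With the strata above (sizes 33, 57, and 10), proportional allocation puts roughly 12, 20, and 3 treated slots in the three blocks, summing to the 35 seminar places.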
Experimental Design Details
Randomization Method
Peru's edition: blocked (stratified) randomization
Argentina's edition: simple randomization
Both were done by a computer.
Randomization Unit
Applicant
Was the treatment clustered?
No
Experiment Characteristics
Sample size: planned number of clusters
-Peru's edition: One cluster of 100 applicants
-Argentina's edition: One cluster of 81 applicants
Sample size: planned number of observations
Peru's edition: 100 applicants
Argentina's edition: 81 applicants
Sample size (or number of clusters) by treatment arms
Peru's edition:
-Treatment: 35 applicants
-Control: 65 applicants
Argentina's edition:
-Treatment: 35 applicants
-Control: 46 applicants
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
IRB
INSTITUTIONAL REVIEW BOARDS (IRBs)
IRB Name
IRB Approval Date
IRB Approval Number
Post-Trial
Post Trial Information
Study Withdrawal
Intervention
Is the intervention completed?
No
Is data collection complete?
Data Publication
Data Publication
Is public data available?
No
Program Files
Program Files
Reports, Papers & Other Materials
Relevant Paper(s)
REPORTS & OTHER MATERIALS