Training Policymakers in Econometrics: Evidence from Two Experiments in Pakistan

Last registered on December 13, 2022


Trial Information

General Information

Training Policymakers in Econometrics: Evidence from Two Experiments in Pakistan
Initial registration date
December 04, 2022


First published
December 13, 2022, 10:40 PM EST




Primary Investigator

New Economic School Moscow

Other Primary Investigator(s)

PI Affiliation
PI Affiliation

Additional Trial Information

Ongoing
Start date
End date
Secondary IDs
Prior work
This trial does not extend or rely on any prior RCTs.
Over the last half century, empirical economics has gone through a paradigm shift. The credibility revolution, with its careful attention to causality, has presented itself as a new paradigm for taking the "con out of econometrics". We study the causal effects of a paradigm shift in the social sciences on practitioners, namely policymakers, using training in the paradigm as its instrument. A consensus seems to be emerging in the literature that policymakers are highly averse to shifting their beliefs and engage in motivated reasoning to justify their initial policy choices. Sticking to priors and being inattentive to evidence may stymie the implementation of good policies that might otherwise spur economic development. How can policymakers be made more receptive to evidence? Will training them in concepts associated with the credibility revolution make them more likely to shift their beliefs? Will it induce them to change their policy choices? Can econometrics training affect state capacity? Can it increase the uptake of tax policies for which there is causal evidence? To answer these questions, we randomize policymakers into econometrics training and trace its impact on their attitudes, their policy choices, and the population.
External Link(s)

Registration Citation

Chen, Daniel, Sultan Mehmood, and Shaheen Naseer. 2022. "Training Policymakers in Econometrics: Evidence from Two Experiments in Pakistan." AEA RCT Registry. December 13.
Experimental Details


Experiment 1 with Junior Ministers. — The first experiment involves an intensive training designed to maximize comprehension, retention, and utilization of the educational materials. We augmented the receipt of the books with lectures from the books' authors, Joshua Angrist and Daniel Siegel, along with competitive writing assignments. As part of the training program, deputy ministers were assigned two essays: the first summarizing every chapter of their assigned book, and the second discussing how the materials would apply to their careers. The junior ministers in each treatment group also participated in a Zoom session to present and discuss the lessons and applications of their assigned book in a structured discussion.
Experiment 2 with Tax Officers. — The second experiment involved tax officers randomized into econometrics versus a placebo training. As in our first experiment, the officers received writing assignments, presented what they learned in class, and engaged in a structured discussion.
Difference Between Experiment 1 and Follow-on Experiment 2. — There are two differences between the first and the second experiment. First, the second experiment also involves a cross-randomization of two signals: half of the tax officers randomly receive an email with a one-page summary of results from a paper providing experimental evidence that sending tax reminders increases tax collection, while the remaining half receive a handout summarizing a correlational study finding that tax reminders increase tax collection. Their prior and posterior beliefs about the impact of tax reminders on tax collection are also elicited. Second, in the experiment with tax officers we have a natural policy outcome (tax reminders sent and tax collection) linked from treated tax officers to their tax district jurisdictions.
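The cross-randomization described above forms a 2x2 design: training arm crossed with signal type. The registry does not specify the assignment procedure, so the balanced round-robin split, the cell labels, and the fixed seed in the sketch below are illustrative assumptions only.

```python
import itertools
import random

def cross_randomize(officer_ids, seed=0):
    """Illustrative 2x2 assignment: training arm (econometrics vs. placebo)
    crossed with signal type (RCT evidence vs. correlational evidence).
    The registry does not state the actual procedure; this is a sketch."""
    rng = random.Random(seed)
    ids = list(officer_ids)
    rng.shuffle(ids)
    # The four cells of the 2x2 design.
    cells = list(itertools.product(["econometrics", "placebo"],
                                   ["rct_signal", "correlational_signal"]))
    assignment = {cell: [] for cell in cells}
    # Deal shuffled officers round-robin into the four cells for balance.
    for i, officer in enumerate(ids):
        assignment[cells[i % 4]].append(officer)
    return assignment

assignment = cross_randomize(range(300))  # 300 tax officers, 75 per cell
```

With 300 officers, each of the four cells receives exactly 75 individuals, matching the equal-split reading of "half of the tax officers" within each training arm.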

Hypotheses.— We will test the following main hypotheses in the study:
H1: Econometrics training impacts policy choices on deworming (Experiment 1)
H2: Econometrics training has no impact on policy choices on orphanage and school renovations (Experiment 1)
H3: Econometrics training impacts the sending of tax reminder letters (Experiment 2)
H4: Econometrics training impacts tax collection (Experiment 2)
We will test the following supplementary hypotheses:
H5: Econometrics training will increase adoption of sending tax reminders more among officers who received the signal study with RCT evidence (Experiment 2)
H6: Econometrics training will not affect, or will decrease, adoption of sending tax reminders among officers who received the signal study with correlational evidence (Experiment 2).
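For concreteness, a two-group comparison of the kind H1–H4 call for could be run as a simple permutation test on an outcome such as whether an officer sent tax reminders. The function below is a generic sketch under that assumption, not the estimator from the registered pre-analysis plan.

```python
import random

def permutation_test(treated, control, n_perm=5000, seed=1):
    """Two-sided permutation test for a difference in means between a
    treated group (e.g., econometrics training) and a control group
    (placebo training). Returns the permutation p-value. Sketch only:
    the pre-analysis plan specifies the actual estimators."""
    obs = sum(treated) / len(treated) - sum(control) / len(control)
    pooled = list(treated) + list(control)
    rng = random.Random(seed)
    extreme = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        t, c = pooled[:len(treated)], pooled[len(treated):]
        # Count reshuffles at least as extreme as the observed difference.
        if abs(sum(t) / len(t) - sum(c) / len(c)) >= abs(obs):
            extreme += 1
    return extreme / n_perm
```

For example, with a binary adoption outcome, `permutation_test([1]*30, [0]*30)` returns a p-value near zero, while two identical groups yield p = 1.0, since every permutation is at least as extreme as an observed difference of zero.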
Intervention Start Date
Intervention End Date

Primary Outcomes

Primary Outcomes (end points)
In the first experiment with deputy ministers, the policy-decision outcomes are the fiscal support or budgetary requests that deputy ministers make to the Finance Ministry of Pakistan. Policy choices data on fiscal support are available for three policies: one related to our signal of RCT evidence (deworming policy) and two placebo policies (school and orphanage renovations) unrelated to the signal. The funding requests are made roughly a month before the federal budget for the next fiscal year is announced each year. The data on willingness-to-pay, attitudes, and beliefs were collected by the research team. In the second experiment with tax officers, the main outcomes are the adoption of sending tax reminder letters and the total tax collected by the tax officer.
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
In the first experiment, we randomly assign 200 ministers to either econometrics training or a placebo training unrelated to econometrics. We also elicit demand for econometrics training prior to the experiment and control for it in all regressions.
In the second experiment, we randomly assign 300 tax officers to either econometrics training or a placebo training unrelated to econometrics.
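The individual-level, computer-based randomization described above could look like the following. The registry states only that randomization is by computer at the individual level; the 50/50 split and fixed seed below are illustrative assumptions.

```python
import random

def assign_arms(ids, seed=42):
    """Randomly split individual policymakers 50/50 into econometrics
    training vs. placebo training (illustrative sketch; the registry
    does not publish the actual assignment code)."""
    rng = random.Random(seed)
    shuffled = list(ids)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return {"econometrics": shuffled[:half], "placebo": shuffled[half:]}

ministers = assign_arms(range(1, 201))  # 200 deputy ministers, 100 per arm
officers = assign_arms(range(1, 301))   # 300 tax officers, 150 per arm
```

Applied to both samples, this yields the 250-individuals-per-arm figure reported below (100 ministers plus 150 tax officers in each treatment arm).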

Experimental Design Details
Randomization Method
Randomization by a computer
Randomization Unit
Individual policymaker
Was the treatment clustered?

Experiment Characteristics

Sample size: planned number of clusters
200 deputy ministers and 300 tax officers.
Sample size: planned number of observations
200 + 300 = 500 policymakers
Sample size (or number of clusters) by treatment arms
250 individuals per treatment arm.
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
Supporting Documents and Materials


Document Name
Analysis Plan
Document Type
Document Description
Pre-Analysis Plan
Analysis Plan

MD5: adc83fa3f9a6ea68817d3f1763cefd31

SHA1: 5d6aa1226ed2ef9cd66a5cf70b42f303f4f841a9

Uploaded At: December 04, 2022


Institutional Review Boards (IRBs)

IRB Name
Lahore School of Economics Ethical Review Board
IRB Approval Date
IRB Approval Number
Analysis Plan

Analysis Plan Documents




Post Trial Information

Study Withdrawal

There is information in this trial unavailable to the public.


Is the intervention completed?
Data Collection Complete
Data Publication

Data Publication

Is public data available?

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials