Adaptive Treatment Assignment for Policy Choice: Phone Enrollment for an Agricultural Extension Service

Last registered on May 06, 2021


Trial Information

General Information

Adaptive Treatment Assignment for Policy Choice: Phone Enrollment for an Agricultural Extension Service
Initial registration date
June 02, 2019


First published
June 04, 2019, 11:28 AM EDT


Last updated
May 06, 2021, 4:23 PM EDT




Primary Investigator

World Bank

Other Primary Investigator(s)

PI Affiliation
Oxford University

Additional Trial Information

Start date
End date
Secondary IDs
The purpose of this experiment is to test the implementation of an adaptive experimental procedure designed by the authors in a real-world context.
The goal of many experiments is to inform the choice between different policies. However, standard experimental designs are geared toward point estimation and hypothesis testing. We consider the problem of treatment assignment in an experiment with several cross-sectional waves where the goal is to choose among a set of possible policies (treatments) for large-scale implementation. We show that optimal experimental designs learn from earlier waves by assigning more experimental units to the better-performing treatments in later waves. We propose a computationally tractable approximation of the optimal design, based on a modification of Thompson sampling.
We collaborate with a non-profit organization that supports smallholder farmers in developing countries by providing personalized agricultural advice through mobile phones. In India, we apply the proposed adaptive trial design to help the non-profit choose efficiently among several alternative call procedures (treatments) for enrolling farmers in its service. Since the adaptive design is based on Bayesian optimization, it allocates more subjects to the most successful treatment arms in consecutive study waves, which improves learning and at the same time applies the more successful treatment variants to more subjects earlier.

External Link(s)

Registration Citation

Kasy, Maximilian and Anja Sautmann. 2021. "Adaptive Treatment Assignment for Policy Choice: Phone Enrollment for an Agricultural Extension Service." AEA RCT Registry. May 06.
Experimental Details


Precision Agriculture for Development (PAD) is testing interactive voice response (IVR) phone calls aimed at enrolling as many rice farmers as possible in a farmer information service. The main content of the call consists of a set of enrollment questions and does not vary across treatments.
The experiment varies how these phone calls are initiated. Treatment variants considered included sending a text message before the call to announce the call time, starting the call with a jingle, calling at different times of the day, etc.
The final choice of treatments selected by PAD for testing was:
- sending no text message to announce the call, sending a text message 1 hour before, and sending a text message 24 hours before;
- calling farmers at 10 am, and calling farmers at 7 pm.
Text messaging and call times were cross-randomized for a total of 6 treatment arms.
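The 3 × 2 cross-randomization described above can be enumerated directly; a minimal sketch (arm labels are illustrative, not PAD's internal naming):

```python
from itertools import product

# Three SMS announcement options crossed with two call times.
sms_options = ["no SMS", "SMS 1 hr before", "SMS 24 hrs before"]
call_times = ["10 am", "7 pm"]

arms = [f"{time}, {sms}" for sms, time in product(sms_options, call_times)]
print(len(arms))  # 6 treatment arms
```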
Each study wave takes two days (one day to send text messages, one day to carry out calls).
Intervention Start Date
Intervention End Date

Primary Outcomes

Primary Outcomes (end points)
Successful call completion: binary variable describing whether the call recipient answered five IVR questions asked during the call (answers to these questions enable processing).
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
The study is conducted in India, with farmers who are potential clients of Precision Agriculture for Development, an NGO that attempts to improve the lives of farmers by providing tailored farming advice.
Precision Agriculture for Development (PAD) is testing a variety of different interactive voice response (IVR) treatments. The purpose of these treatments is to enroll as many rice farmers as possible into the farmer information service. PAD designed these treatments based on pilots and experience from other enrollment campaigns. The treatments are fully designed and implemented by PAD and their final choice was to test six distinct treatment arms (see above). The adaptive experimental procedure the authors designed helps PAD to choose the treatment that works best for their setting.
The experiment is conducted in waves of 600. The only information PAD has is a very large set of phone numbers provided by an outside party. An employee of PAD processes these numbers by querying whether they are valid and whether they are on a do-not-disturb list. Only valid numbers not on the list are then used. PAD is setting aside 10,000 phone numbers that are up for calling in the target period for this experiment.
Experimental waves are carried out consecutively, where each wave takes two days to administer (text messages sent up to 24 hours ahead of time, two call times in morning and evening on the call day). In the first wave, 600 phone numbers are randomly selected from the set of 10,000 and each is assigned randomly to one of the six treatment arms (100 per arm in expectation). The outcomes of each experimental wave are submitted to an app programmed by the authors. The app uses treatment success rates in each treatment arm to update a flat prior on possible average success rates, and applies the Exploration Sampling algorithm to choose the number of phone numbers assigned to each treatment arm in the next wave. A new wave of 600 phone numbers is then randomly selected and assigned to treatment arms. Exploration Sampling adapts standard Thompson sampling (designed for waves of size 1) by (a) reducing sampling variation by taking advantage of batch sampling (600 units per wave here), and (b) redistributing experimental units from the best-performing option to close competitors to improve learning. The experiment continues until 10,000 phone numbers are used.
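The update step can be sketched as follows. This is a minimal illustration, assuming independent Beta(1, 1) (flat) priors per arm, Monte Carlo estimation of the Thompson probabilities p_k (the posterior probability that arm k has the highest success rate), and next-wave shares proportional to p_k(1 − p_k), which shifts units from the current front-runner to close competitors. The function name and the example outcome data are illustrative, not taken from the trial:

```python
import numpy as np

def exploration_sampling_shares(successes, trials, n_draws=10_000, seed=0):
    """Estimate next-wave assignment shares for each treatment arm.

    successes, trials: per-arm counts of completed calls and attempted calls.
    Posterior per arm under a flat Beta(1, 1) prior is Beta(1 + s, 1 + n - s).
    """
    rng = np.random.default_rng(seed)
    successes = np.asarray(successes)
    trials = np.asarray(trials)
    # Draw posterior success rates for all arms jointly.
    draws = rng.beta(1 + successes, 1 + trials - successes,
                     size=(n_draws, len(successes)))
    # p_k: fraction of posterior draws in which arm k has the highest rate.
    p = np.bincount(draws.argmax(axis=1), minlength=len(successes)) / n_draws
    # Exploration sampling: shares proportional to p_k * (1 - p_k).
    q = p * (1 - p)
    return q / q.sum()

# Illustrative outcomes after one wave across the six arms.
shares = exploration_sampling_shares(
    successes=[30, 45, 38, 12, 25, 33], trials=[100] * 6)
print(np.round(shares, 3))
```

Note how an arm with p_k near 1 receives a smaller share than under plain Thompson sampling, freeing units to sharpen comparisons among its closest competitors.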
Experimental Design Details
Randomization Method
Computer-based randomization.
Randomization Unit
Individual phone number.
Was the treatment clustered?

Experiment Characteristics

Sample size: planned number of clusters
600 (or fewer) per wave up to a total of 10,000 numbers.
Sample size: planned number of observations
600 (or fewer) per wave for a total of 10,000 numbers.
Sample size (or number of clusters) by treatment arms
100 per wave in expectation. The number of units in each arm is chosen as part of the experimental design during the trial. The final sample size approved by the IRB may be up to 10,000.
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
N/A - sample size is predetermined and treatments are evaluated based on expected regret.

Institutional Review Boards (IRBs)

IRB Name
Massachusetts Institute of Technology, Committee on the Use of Humans as Experimental Subjects (COUHES)
IRB Approval Date
IRB Approval Number


Post Trial Information

Study Withdrawal



Is the intervention completed?
Intervention Completion Date
July 06, 2019, 12:00 +00:00
Data Collection Complete
Data Collection Completion Date
July 06, 2019, 12:00 +00:00
Final Sample Size: Number of Clusters (Unit of Randomization)
Was attrition correlated with treatment status?
Final Sample Size: Total Number of Observations
Final Sample Size (or Number of Clusters) by Treatment Arms
903 (10 am, no SMS), 3931 (10 am, SMS 1 hr ahead), 2234 (10 am, SMS 24 hrs ahead), 366 (6:30 pm, no SMS), 1081 (6:30 pm, SMS 1 hr ahead), 1485 (6:30 pm, SMS 24 hrs ahead)
Reports, Papers & Other Materials

Relevant Paper(s)

Standard experimental designs are geared toward point estimation and hypothesis testing, while bandit algorithms are geared toward in-sample outcomes. Here, we instead consider treatment assignment in an experiment with several waves for choosing the best among a set of possible policies (treatments) at the end of the experiment. We propose a computationally tractable assignment algorithm that we call “exploration sampling,” where assignment probabilities in each wave are an increasing concave function of the posterior probabilities that each treatment is optimal. We prove an asymptotic optimality result for this algorithm and demonstrate improvements in welfare in calibrated simulations over both non-adaptive designs and bandit algorithms. An application to selecting between six different recruitment strategies for an agricultural extension service in India demonstrates practical feasibility.
Kasy, Maximilian and Anja Sautmann, 2021. Adaptive treatment assignment in experiments for policy choice. Econometrica 89(1), 113-132.

Reports & Other Materials