The Impact of Job-Specific Training on Short-Term Worker Performance: Evidence from a Field Experiment

Last registered on April 17, 2017

Pre-Trial

Trial Information

General Information

Title
The Impact of Job-Specific Training on Short-Term Worker Performance: Evidence from a Field Experiment
RCT ID
AEARCTR-0001515
Initial registration date
August 25, 2016

First published
August 25, 2016, 2:08 PM EDT

Last updated
April 17, 2017, 10:46 PM EDT

Locations

Primary Investigator

Affiliation
UC San Diego

Other Primary Investigator(s)

Additional Trial Information

Status
Completed
Start date
2015-01-01
End date
2016-07-01
Secondary IDs
Abstract
Remote and short-term work arrangements are increasingly common despite the limited incentives they provide for acquiring firm-specific knowledge. This study examines the importance and cost-effectiveness of firm-specific training for remote contract workers using a field experiment run among short-term, outside salespeople employed by an insurance firm in Northern Kenya. In particular, I test whether giving these workers the option to invest in firm-specific training through a mobile training application affects their performance, retention, and subsequent firm earnings, and how this varies with monetary and competition-based incentives to invest in the training. Preliminary findings demonstrate that having access to firm-specific training significantly increases worker performance. These results suggest that the provision of low-cost, firm-specific training to short-term workers has the potential to significantly increase firm performance, and that workers facing pay-for-performance incentives are frequently willing to invest in this training with little to no added incentive to do so. This study has implications for our understanding of short-term and remote worker management, and of the development of the contract labor market and the on-demand economy more generally.
External Link(s)

Registration Citation

Citation
Lyons, Elizabeth. 2017. "The Impact of Job-Specific Training on Short-Term Worker Performance: Evidence from a Field Experiment." AEA RCT Registry. April 17. https://doi.org/10.1257/rct.1515-3.0
Former Citation
Lyons, Elizabeth. 2017. "The Impact of Job-Specific Training on Short-Term Worker Performance: Evidence from a Field Experiment." AEA RCT Registry. April 17. https://www.socialscienceregistry.org/trials/1515/history/16646
Sponsors & Partners

There is information in this trial unavailable to the public.

Experimental Details

Interventions

Intervention(s)
Intervention Start Date
2015-08-01
Intervention End Date
2015-10-01

Primary Outcomes

Primary Outcomes (end points)
Total insurance premiums collected in the intervention period, short-term worker retention in the post-intervention period
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
In order to provide causal evidence on whether offering firm-specific training to short-term remote workers affects their performance, I ran a field experiment in which I randomly assigned the option to invest in firm-specific training through a mobile training application to insurance sales agents in Northern Kenya. This design allows me to control for selection into training while still testing whether workers are willing to invest in it. Below I describe my research setting, the population of agents in my study, the experimental treatment groups, and the provision of training to agents in the treatment groups.

Research Setting:

I ran my experiment in Northern Kenya in collaboration with Index Based Livestock Insurance (IBLI), a project group that operates out of the International Livestock Research Institute (ILRI) in Nairobi, Kenya, and Takaful Insurance of Africa (TIA), the largest Sharia compliant insurance company in East and Central Africa. TIA distributes and underwrites IBLI in Northern Kenya.

IBLI is an index-based insurance product that covers pastoralists in arid and semi-arid regions against the risk of drought-related livestock losses. TIA sells IBLI to Kenyan pastoralists using an outside sales-agent model in which agents are selected in part based on their literacy and their relationship to their local community. They are tasked with selling IBLI within their communities during two-month sales windows. Sales windows occur twice a year (from the beginning of January to the end of February, and from the beginning of August to the end of September), and there is substantial agent turnover between windows. For instance, more than 50% of agents employed for the first sales window in 2015 did not renew their contracts for the second sales window. In the sales window covered by the experiment, TIA was providing IBLI in 6 counties in Northern Kenya: Garissa, Isiolo, Mandera, Marsabit, Moyale, and Wajir. Each county is divided into divisions (ranging from 4 to 16 per county), and each division is assigned one lead agent and several sub-agents.

The pastoralists who are the targets of IBLI are largely unfamiliar with and leery of the product, and of insurance in general. Over ten times more IBLI sales agents surveyed during the January-February 2015 sales window reported that poor pastoralist education about insurance was a bigger barrier to their sales than the cost of the insurance product. IBLI, like all insurance products, is complex, and misinforming customers about what to expect from the coverage is illegal and can lead to further mistrust of the product within communities. Unlike many other insurance products, IBLI provided by TIA is Sharia compliant, and payouts are based on an index tied to the Normalized Difference Vegetation Index (NDVI). Therefore, knowledge about IBLI and about TIA as a provider of Sharia compliant insurance is critical for agents to develop and maintain sales.

Study Population

All lead and sub-agents employed by TIA to sell IBLI during the August-September 2015 sales window are included in my sample. Lead agents are tasked with finding and training sub-agents in addition to selling the product. To help them achieve this, lead agents attend a three-day in-person training camp where they learn about IBLI, TIA, and selling in general. Sub-agents are small business owners, for instance, retail store owners or M-PESA agents, who are recruited because they have frequent contact with potential customers. Although they have sales experience as store owners, anecdotal evidence from conversations with TIA employees and lead agents suggests that they are frequently semi-literate or illiterate and, on average, have a primary education. Lead agents are more educated than sub-agents (on average, they have some post-secondary education), but, like sub-agents, they frequently have additional income-earning activities and do not consider selling IBLI to be their career.

Agents are assigned to sell in the division they live in; however, they are free to sell in other divisions as well. TIA employs local workers first because they speak the local dialects and are more likely to be trusted by the pastoralists in their community, and second because it is difficult to recruit non-local workers to relocate to these relatively under-developed and remote regions.

Experimental Treatments:

To test whether short-term workers are willing to invest in firm-specific knowledge, and how the provision of this knowledge affects firm performance, I randomly assigned sales agents into one of three treatment groups or a control group. Agents in all three treatment groups received access to a mobile training application that provides training modules, quizzes, and frequently asked questions on TIA and the insurance product they are selling. One treatment group was sent a version of the training application that offers agents the opportunity to be awarded phone credit for good performance on the module quizzes. Going forward, I refer to this as the incentive group. The second treatment group was sent a version of the training application that includes game features (specifically, a leader board and badges for participation and good performance on the module quizzes). Going forward, I refer to this as the competition group. The third treatment group did not receive any additional content on their training application. Going forward, I refer to this as the basic group. Agents in the control group did not receive access to the training application.

Agents in the incentive group were awarded KSH 250 in phone credit for each quiz for which they got at most one question wrong. Although agents can take each quiz as many times as they like, they can only be awarded phone credit once per quiz, and only for their first attempt at the quiz. Therefore, these agents could earn up to KSH 1750 from the application (about 17 USD at the time of the study). Agents in the competition group had access to a leader board on their applications that showed them how they ranked relative to all other agents (including those not in the competition group). Only agent numbers were listed on the leader board so as to protect agents' privacy. Agents in this group could also receive digital badges for completing each quiz and separately for performing well on each quiz (i.e. getting at most one question wrong).

To avoid agents in the control group learning about the training application or gaining access to its content, and to avoid agents in one treatment group learning about other versions of the application and potentially becoming upset by the differences or becoming aware of a potential study, randomization occurred at the division level. Agents were not aware of the study, nor were they aware that there were different versions of the application or that some agents did not receive access to it.
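The division-level (cluster) assignment described above can be sketched as follows. This is an illustrative reconstruction only: the division identifiers, arm labels, and seed are my own assumptions, not the study's actual data or procedure (which used coin flips).

```python
import random


def assign_divisions(divisions, arms, seed=42):
    """Randomly assign each division (cluster) to a study arm.

    All agents within a division receive the same assignment,
    mirroring division-level randomization.
    """
    rng = random.Random(seed)  # fixed seed for reproducibility
    return {d: rng.choice(arms) for d in divisions}


# Hypothetical identifiers for the 54 weather divisions
divisions = [f"division_{i}" for i in range(1, 55)]
arms = ["control", "basic", "incentive", "competition"]
assignment = assign_divisions(divisions, arms)
```

Because every agent inherits the arm of their division, spillovers between agents in the same division cannot contaminate the contrast between arms.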

Training Provision:

Training was provided to agents in the treatment groups through an Android application developed by contracted computer programmers in a collaboration with TIA and IBLI. TIA began providing all agents with Android smart phones in the January-February, 2015 sales season to allow agents to record all their sales on a sales transaction application developed for IBLI sales agents. This roll-out confirmed the willingness and ability of agents to use smart phone applications.

The training application was developed to provide simple, direct, and concise statements about the insurance product and TIA. These statements are organized into 7 training modules. At the conclusion of each training module, agents can complete a quiz on the content covered in the corresponding module. The application also includes a frequently asked questions section that agents can refer to while they are selling. This section includes 17 questions and answers meant to address confusion agents in previous sales windows had when selling the product.

Training applications were provided to agents in the treatment groups through SMS five days before the beginning of the August-September 2015 sales window. SMS messages included download instructions, a link to the application, and, depending on which treatment group agents were in, a message about the application. In the basic treatment group, agents were told they could use the application to help them better understand IBLI and potentially improve their sales. In the incentive treatment group, agents were told the same thing but were also told they could earn phone credit by performing well on the quizzes in the training application. In the competition treatment group, agents were told what agents in the basic group were told, and were also told that they could see how their performance on the training application quizzes compared with that of other agents and that they could receive badges on the application for participating in the training and performing well on the quizzes. Over the next four days, each agent in the treatment groups was contacted by phone by a researcher at IBLI and/or by TIA county coordinators, who are permanent employees, to determine whether they had received the app and been able to download it. In some cases, because of poor network coverage in some areas, efforts to reach agents by phone continued after the sales window had begun.

Experimental Design Details
Randomization Method
Coin flip
Randomization Unit
Weather division (sub-regions within counties where spillovers between agents are expected)
Was the treatment clustered?
Yes

Experiment Characteristics

Sample size: planned number of clusters
54 weather divisions
Sample size: planned number of observations
296 sales agents
Sample size (or number of clusters) by treatment arms
85 sales agents control, 66 sales agents basic training, 69 sales agents training with monetary incentive, 76 sales agents training with competition features
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
Based on the previous season's sales and the expected benefits from training: 48 agents in each of control and treatment to obtain statistically significant differences in premiums collected by insurance agents in treatment and control at the 5% level with a power of 80%.
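A figure of this kind can be sketched with the standard two-sample normal-approximation formula, n per arm = 2(z_{1-α/2} + z_{power})² / d², where d is the standardized effect size. The sketch below is my own back-of-the-envelope reconstruction: the registration does not state the assumed effect size, and this simple version ignores the clustering design effect that the registered calculation accounts for.

```python
from statistics import NormalDist


def n_per_arm(effect_size, alpha=0.05, power=0.80):
    """Sample size per arm for a two-sided, two-sample test of means
    (normal approximation, equal variances, no clustering adjustment).

    effect_size: standardized mean difference (Cohen's d).
    """
    z_a = NormalDist().inv_cdf(1 - alpha / 2)  # critical value, two-sided test
    z_b = NormalDist().inv_cdf(power)          # quantile for desired power
    return 2 * ((z_a + z_b) / effect_size) ** 2
```

Under these assumptions, a standardized effect of roughly 0.57 implies about 48 agents per arm; accounting for the design effect from division-level clustering would inflate this requirement.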
IRB

Institutional Review Boards (IRBs)

IRB Name
UC San Diego Human Research Protections Program
IRB Approval Date
2015-07-14
IRB Approval Number
150791

Post-Trial

Post Trial Information

Study Withdrawal

There is information in this trial unavailable to the public.

Intervention

Is the intervention completed?
Yes
Intervention Completion Date
October 01, 2015, 12:00 +00:00
Data Collection Complete
Yes
Data Collection Completion Date
July 01, 2016, 12:00 +00:00
Final Sample Size: Number of Clusters (Unit of Randomization)
54 weather divisions
Was attrition correlated with treatment status?
No
Final Sample Size: Total Number of Observations
296 sales agents
Final Sample Size (or Number of Clusters) by Treatment Arms
85 sales agents control, 66 sales agents basic training, 69 sales agents training with monetary incentive, 76 sales agents training with competition features
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials