The Impact of Personalized Telephone Outreach on Health Insurance Choices

Last registered on October 19, 2020


Trial Information

General Information

The Impact of Personalized Telephone Outreach on Health Insurance Choices
Initial registration date
September 29, 2020
Last updated
October 19, 2020, 5:17 PM EDT


Primary Investigator

University of Wisconsin-Madison

Other Primary Investigator(s)

PI Affiliation
Covered California
PI Affiliation
Covered California
PI Affiliation
PI Affiliation
University of Illinois at Chicago

Additional Trial Information

Start date
End date
Secondary IDs
Abstract
Many Americans have limited understanding of, and information about, health insurance. Limited awareness of plan options, subsidy availability, and plan costs makes enrolling and choosing plans difficult. The goal of the project is to assess whether a personalized information intervention can help health insurance marketplaces improve take-up, choice quality, and market stability. Specifically, the project will shed light on the following questions:
I. What is the effect of personalized phone outreach on enrollment (take-up) and the risk mix of consumers in the marketplace?
II. How do personalized phone calls affect consumers’ decision of which plan to choose?
III. Are personalized outbound calls particularly effective with certain hard-to-reach populations?
IV. Phone calls are resource intensive; what are the financial implications for the Marketplace and for the insurance market generally of a phone-based intervention to encourage take-up?
External Link(s)

Registration Citation

Feher, Andrew et al. 2020. "The Impact of Personalized Telephone Outreach on Health Insurance Choices." AEA RCT Registry. October 19.
Experimental Details


To understand the effect of a personalized information intervention on health insurance enrollment and plan choice, Covered California implemented telephone consultations during the open enrollment period for 2019 coverage to help inform consumers of their plan options, costs, and plan benefits, and to assist with enrollment. In this intervention, eligible consumers who had not yet picked a plan by a date close to the deadline were randomly selected either to receive no phone call or to receive a call from a Service Center Representative (SCR), who would discuss their health insurance options with them. In contrast to low-touch interventions, which passively inform consumers about plan options, this intervention assisted consumers more intensively and interactively.
Because call-center time is a scarce resource during the open enrollment period, and it was known in advance that not all eligible consumers could be reached, Covered California implemented the outbound call campaign with a randomized control group. We propose to use administrative data on health insurance choices and phone calls from this randomized controlled trial to understand how personalized assistance informs take-up, plan selections, and costs. The data gathered from the trial include consumers’ random treatment assignment, personal characteristics, and subsequent health insurance decisions.
Intervention Start Date
Intervention End Date

Primary Outcomes

Primary Outcomes (end points)
Selecting a plan; effectuated enrollment
Primary Outcomes (explanation)
The primary outcome of interest is an indicator for whether a consumer selected a plan during Open Enrollment. A related outcome will be an indicator for effectuated enrollment, i.e., whether the consumer paid for their coverage.

Secondary Outcomes

Secondary Outcomes (end points)
Choice quality; the risk mix of enrollees; engagement with the intervention
Secondary Outcomes (explanation)
Choice quality: A growing literature has documented that many consumers in the United States purchase the "wrong" plan given their available options, selecting so-called "dominated" plans (plans that cost more but provide equivalent or worse coverage than other available options) because they do not fully understand their health insurance options. Consumers facing challenges in plan choice may be especially likely to choose such plans. It is plausible that an intensive, interactive information intervention could help consumers weigh their options more appropriately and steer them away from these choices, with direct implications for consumer expenditure risk and well-being.
Our key outcome related to choice quality will be an indicator for whether a consumer made a choice error by selecting a dominated plan. We consider a given consumer to have selected a dominated plan under any of the following conditions: a) the consumer was eligible for a Silver 87 plan but chose a Gold plan; b) the consumer was eligible for a Silver 94 plan but chose a Gold or Platinum plan; c) the consumer was eligible for a Silver 94, $1-premium plan but chose a $1-premium Bronze plan; d) the consumer had household income greater than 200% of FPL and chose a Silver plan when eligible for a lower-premium Gold plan offered by the same insurer.
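The four dominated-plan conditions above can be expressed as a simple decision rule. The sketch below is purely illustrative: the field names (chosen_tier, csr_eligibility, and so on) are hypothetical and do not reflect the actual Covered California data schema.

```python
# Hypothetical sketch of the dominated-plan classification rule.
# All field names are illustrative assumptions, not the real schema.

def is_dominated_choice(consumer: dict) -> bool:
    """Return True if the plan choice matches any of conditions (a)-(d)."""
    tier = consumer.get("chosen_tier")        # "Bronze", "Silver", "Gold", "Platinum"
    csr = consumer.get("csr_eligibility")     # "Silver 87", "Silver 94", or None

    # (a) Eligible for Silver 87 but chose Gold
    if csr == "Silver 87" and tier == "Gold":
        return True
    # (b) Eligible for Silver 94 but chose Gold or Platinum
    if csr == "Silver 94" and tier in ("Gold", "Platinum"):
        return True
    # (c) Eligible for a $1-premium Silver 94 plan but chose a $1-premium Bronze plan
    if (csr == "Silver 94" and consumer.get("silver94_premium") == 1
            and tier == "Bronze" and consumer.get("chosen_premium") == 1):
        return True
    # (d) Income > 200% FPL, chose Silver when the same insurer offered a cheaper Gold plan
    if (consumer.get("fpl_pct", 0) > 200 and tier == "Silver"
            and consumer.get("same_issuer_gold_premium", float("inf"))
            < consumer.get("chosen_premium", 0)):
        return True
    return False
```

Applying the rule to a consumer record flags, for example, a Silver 87-eligible consumer who chose a Gold plan as a dominated choice.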

The risk mix of enrollees: We will examine the effect of the intervention on risk levels of consumers who purchased a plan in the marketplace, using the CDPS risk scores to capture consumers’ recent health spending. In particular, we will study two measures of risk mix in the marketplace: a) average market risk, driven by the health risk of consumers brought into the market by the intervention; and b) the sorting of consumers, by risk, across plans with higher vs. lower actuarial value (e.g., Silver vs. Gold tier plans), as well as sorting by issuer. The former captures the effect of the intervention on total market risk, while the latter can impact relative profits of plans across tiers and issuers.
It is possible that the intervention, like a low-touch letter intervention, encouraged healthier people to enroll in a plan. Alternately, the more intensive nature of the intervention may have lowered the barriers in making difficult plan choice decisions, which sicker patients find more valuable. By jointly examining the enrollment and plan choice decisions, we will be able to assess which frictions in the enrollment process the intervention is likely targeting and assess its implications for risk selection and sorting across plans.

Engagement with the intervention: If the outbound call intervention has a larger impact than low-touch interventions, such as reminder letters, in-depth conversations with the SCR could account for this difference. Yet not all consumers who received an outbound call ultimately engaged in such a conversation: some hung up shortly after taking the call, while others let the call from the SCR go to voicemail and did not return it.
To better understand this issue, we will examine the factors predicting consumers’ engagement with the intervention, measured by having a conversation with an SCR. Differing levels of engagement with the intervention can help to explain heterogeneous effects of the intervention across different groups of consumers, and inform future targeting of the intervention. The outcome of interest for this analysis will be whether a consumer has ever engaged with an SCR in a conversation lasting 1 minute or longer; other cutoffs such as 30 seconds, 2 minutes, or the median or mean call length will be used in robustness checks.

Experimental Design

Experimental Design
This is a two-arm research design in which approximately 30 percent of cases were assigned to the control group and 70 percent to the treatment group. A description of the two arms follows:

Arm 1. Treatment arm (N1=55,519): These consumers were placed on a list to receive a call from an SCR. The goal of the call was to provide information about their likely eligibility for coverage through Covered California, provide personalized information about plan options, and offer live assistance in choosing a plan, including answering on-the-spot questions. If the call went to voicemail, the SCR left a message instructing the recipient to call the service center hotline if they would like further assistance. In total, 39,309 members of the treatment group had their files reviewed to confirm they were still eligible; those who were eligible received an outbound call from an SCR if staff were available. In the end, 27,123 consumers received an outbound call and 28,396 did not receive a call before the end of open enrollment.

Arm 2. Control arm (N2=24,003): No phone call was placed to these consumers.
Experimental Design Details
Randomization Method
Randomization occurred at the household level and was based on the last digit of the individual case ID. Cases ending in 1, 2, or 3 were assigned to the control group; cases ending in any other digit were assigned to the treatment group. Preliminary balance checks support the validity of the randomization, and further checks are proposed as part of the analysis plan.
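The last-digit assignment rule can be sketched in a few lines. Since the three control digits (1, 2, 3) cover three of the ten possible last digits, the rule mechanically produces the 30-70 split described above. The function name is illustrative.

```python
# Minimal sketch of the assignment rule: the last digit of the case ID
# determines the arm (1, 2, or 3 -> control; any other digit -> treatment).

def assign_arm(case_id: int) -> str:
    """Assign a household to an arm based on the last digit of its case ID."""
    return "control" if case_id % 10 in (1, 2, 3) else "treatment"
```

Because exactly 3 of 10 last digits map to control, uniformly distributed case IDs yield an expected 30 percent control share.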
Randomization Unit
Was the treatment clustered?

Experiment Characteristics

Sample size: planned number of clusters
Sample size: planned number of observations
During the 2019 Open Enrollment period, Covered California identified 79,522 individuals who had initiated paperwork to obtain Covered California health insurance coverage for the 2019 coverage year but had not yet selected a plan and enrolled. This group comprises our study sample.
Sample size (or number of clusters) by treatment arms
Arm 1. Treatment arm (N1=55,519): These households were placed on a list to receive a call from an SCR. Those who were eligible received an outbound call from an SCR if staff were available. In the end, 27,123 received an outbound call and 28,396 did not receive a call before the end of open enrollment.

Arm 2. Control arm (N2=24,003): No phone call was placed to these consumers.
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
To arrive at an estimate for the minimum detectable effect, we assume a baseline plan selection rate of 10 percent based on prior data (Domurat, Menashe, and Yin 2019). With that assumed base rate and 79,500 cases allocated to treatment and control arms in a 70-30 split, the analysis is powered at the 80% level to detect a 0.7 percentage point increase in plan selection rates.
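The stated minimum detectable effect follows from the standard two-proportion formula, MDE = (z_{1-α/2} + z_{power}) · sqrt(p(1-p)(1/n1 + 1/n2)). The short sketch below reproduces the calculation under the registration's assumptions (5% two-sided test, 80% power, 10% base rate, 79,500 cases split 70-30); the function is our own illustration, not the authors' code.

```python
# Back-of-the-envelope check of the stated minimum detectable effect,
# using the standard two-proportion MDE formula:
#   MDE = (z_{1-alpha/2} + z_{power}) * sqrt(p(1-p) * (1/n1 + 1/n2))
from statistics import NormalDist

def mde(p: float, n_treat: float, n_control: float,
        alpha: float = 0.05, power: float = 0.80) -> float:
    """Minimum detectable effect for a difference in proportions."""
    z = NormalDist().inv_cdf
    se = (p * (1 - p) * (1 / n_treat + 1 / n_control)) ** 0.5
    return (z(1 - alpha / 2) + z(power)) * se

n = 79_500
print(mde(0.10, 0.7 * n, 0.3 * n))  # roughly 0.0065, i.e., about 0.7 percentage points
```

The result, roughly 0.65 percentage points, is consistent with the 0.7 percentage point figure reported above.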

Institutional Review Boards (IRBs)

IRB Name
California Health and Human Services Agency: Office of Human Research Protections
IRB Approval Date
IRB Approval Number
Analysis Plan

Analysis Plan Documents


MD5: d271d1caac4641658fe136f14b2eef2f

SHA1: 7601baf3c7903aab3ad62ae9d270ba2e952cff96

Uploaded At: September 24, 2020


Post Trial Information

Study Withdrawal

There are documents in this trial that are unavailable to the public.


Is the intervention completed?
Data Collection Complete
Data Publication

Data Publication

Is public data available?

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials