Physician Response to Prices of Other Physicians: Evidence from a Field Experiment

Last registered on September 28, 2021

Pre-Trial

Trial Information

General Information

Title
Physician Response to Prices of Other Physicians: Evidence from a Field Experiment
RCT ID
AEARCTR-0008279
Initial registration date
September 22, 2021

First published
September 28, 2021, 2:20 PM EDT

Locations

Primary Investigator

Affiliation
Clemson University

Other Primary Investigator(s)

Additional Trial Information

Status
Completed
Start date
2013-11-01
End date
2021-07-29
Secondary IDs
Prior work
This trial does not extend or rely on any prior RCTs.
Abstract
Recent efforts to increase price transparency for American consumers of health care have largely failed to produce savings. At the same time, little effort has been made toward increasing physician-side price transparency, which may be due to the limited high-quality research available on the topic. I address this gap in the literature by performing a field experiment on physician price transparency in an area that has received little attention: physician referrals. Working with a group of medical practices linked as an Independent Practice Association (IPA), I randomly selected primary care practices to receive a list of prices for new referrals to six ophthalmology practices that were part of the IPA's provider network. These practices handled the bulk of the IPA's ophthalmology patients and represented substitute providers. Using the IPA's administrative data on referrals, I find that during the first two months following the distribution of the price list, the treatment group primary care physicians (PCPs) increased their referral share to the least expensive ophthalmology practice by 147 percent. These referrals were shifted away from the most expensive practice and from practices not listed on the report. These effects were found, however, only for patients for whom the PCPs had a cost-reduction incentive, and the large initial effect dissipated over the following four months. For patients in whom the PCPs had limited financial interest, I find little evidence of a treatment response. These contrasting results suggest the PCPs were influenced by cost-reduction motives and provide evidence of the potential for savings from physician-side price transparency.
External Link(s)

Registration Citation

Citation
Barkowski, Scott. 2021. "Physician Response to Prices of Other Physicians: Evidence from a Field Experiment." AEA RCT Registry. September 28. https://doi.org/10.1257/rct.8279-1.0
Experimental Details

Interventions

Intervention(s)
The experimental treatment for this study was a report listing two numbers for each of six busy ophthalmology practices. The numbers were risk-adjusted, 180-day average costs for patients newly referred to ophthalmology, one for HMO patients and one for Medicare Advantage patients. Together with the cost report, the treatment group PCPs also received a cover letter from the CEO of the IPA briefly explaining the reason for the report and describing how the costs were calculated.

The cost report comprised two parts: the cost table and a set of footnotes with additional information. The left side of the table listed the names of the six ophthalmology practices, with the names of the IPA network ophthalmologists associated with each practice printed below it in smaller, italicized type. The right side listed the two costs, one for each patient type, for each practice.
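For intuition only, here is a minimal sketch of how such per-practice, per-patient-type averages could be tabulated. The actual report used risk-adjusted figures computed by a method not described in this registration, and all data values and column names below are hypothetical:

```python
import pandas as pd

# Hypothetical episode-level data: total cost in the 180 days following
# each new ophthalmology referral. Column names are illustrative only.
episodes = pd.DataFrame({
    "ophtho_practice": ["X", "X", "Y", "Y"],
    "patient_type":    ["HMO", "Medicare Advantage", "HMO", "HMO"],
    "cost_180d":       [850.0, 1200.0, 640.0, 700.0],
})

# One average per practice and patient type, as displayed in the report.
# The real report risk-adjusted these averages; that step is omitted here.
report = (episodes
          .groupby(["ophtho_practice", "patient_type"])["cost_180d"]
          .mean()
          .unstack("patient_type"))
print(report)
```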
Intervention Start Date
2014-05-05
Intervention End Date
2014-11-15

Primary Outcomes

Primary Outcomes (end points)
My primary outcome is the share of a PCP practice's ophthalmology referrals that an ophthalmology practice receives, measured on a bimonthly basis (two-month periods).
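As an illustration of how this outcome could be constructed from referral-level records, here is a minimal sketch. The column names, sample data, and the exact two-month binning convention are assumptions for illustration, not the IPA's actual schema:

```python
import pandas as pd

# Hypothetical referral-level records; column names are illustrative.
referrals = pd.DataFrame({
    "pcp_practice":    ["A", "A", "A", "B", "B"],
    "ophtho_practice": ["X", "X", "Y", "X", "Z"],
    "referral_date":   pd.to_datetime(["2014-05-10", "2014-06-02",
                                       "2014-05-20", "2014-07-01",
                                       "2014-08-15"]),
})

# Bin each referral into a two-month period counted from the month the
# intervention began (May 2014).
start = pd.Timestamp("2014-05-05")
months_since = ((referrals["referral_date"].dt.year - start.year) * 12
                + (referrals["referral_date"].dt.month - start.month))
referrals["period"] = months_since // 2

# Referral counts per (PCP practice, period, ophthalmology practice),
# divided by the PCP practice's total referrals in that period, give
# the referral share, the primary outcome.
counts = referrals.groupby(
    ["pcp_practice", "period", "ophtho_practice"]).size()
shares = counts / counts.groupby(
    level=["pcp_practice", "period"]).transform("sum")
print(shares)
```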
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
The subjects of the experiment were all PCPs associated with the IPA who practiced in either the family practice or internal medicine specialties in an outpatient (ambulatory) setting. Subjects assigned to the treatment group all received an informational treatment: a letter containing historical average costs of six ophthalmology practices affiliated with the IPA, referred to as the "treatment," the "prices," or the "cost report." Control group subjects did not receive anything. To maximize the experimental sample size, all of the IPA's PCPs who were active at the time of the treatment distribution were included if they satisfied minimal criteria: at least ten claims during each calendar month from August 2013 through January 2014, and at least one patient referral to ophthalmology during that period. In the end, a total of 93 PCPs (35 internists and 58 family practitioners) were included in the experiment. These physicians were typically organized into group practices. In total, there were 55 included practices: 24 internal medicine, 30 family practice, and one mixed-specialty group.

To account for the organization of PCPs into practices, assignment into treatment and control groups took place at the practice level, so that either all PCPs in a practice were assigned to the treatment group, or none were.
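A minimal sketch of this kind of practice-level assignment, assuming a simple list of practice identifiers and using the 28/27 split reported under "Experiment Characteristics" below; the seed and identifiers are purely illustrative, not the ones actually used:

```python
import random

# Hypothetical identifiers for the 55 PCP practices.
practices = [f"practice_{i:02d}" for i in range(1, 56)]

rng = random.Random(2014)  # illustrative seed only
rng.shuffle(practices)

# Assignment happens at the practice level: every PCP in a treated
# practice receives the cost report; no PCP in a control practice does.
treatment = practices[:28]  # 28 practices in the treatment arm
control = practices[28:]    # 27 practices in the control arm
```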

During the experiment, the IPA collected data on PCP referrals as part of its normal operations. This data was generated as the PCPs saw and treated patients in their regular offices as part of their usual medical practice. The IPA routinely collects all of this data, and all of the physicians are aware of the collection. Moreover, none of the physicians were made aware that the distribution of the price information was related to an experiment, thereby minimizing any possible influence of the "Hawthorne effect."
Experimental Design Details
Randomization Method
Randomization by computer.
Randomization Unit
Medical primary care practice. Although each practice includes multiple physicians, the outcome is measured at the practice level, so the practice is considered the formal subject. In that sense, the study is not cluster randomized.
Was the treatment clustered?
No

Experiment Characteristics

Sample size: planned number of clusters
55 PCP practices
Sample size: planned number of observations
55 PCP practices × 6 ophthalmology practices × 6 periods = 1,980 observations
Sample size (or number of clusters) by treatment arms
28 treatment, 27 control
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
IRB

Institutional Review Boards (IRBs)

IRB Name
IRB Approval Date
IRB Approval Number

Post-Trial

Post Trial Information

Study Withdrawal

Intervention

Is the intervention completed?
Yes
Intervention Completion Date
November 15, 2014, 12:00 +00:00
Data Collection Complete
Yes
Data Collection Completion Date
Final Sample Size: Number of Clusters (Unit of Randomization)
Was attrition correlated with treatment status?
Final Sample Size: Total Number of Observations
Final Sample Size (or Number of Clusters) by Treatment Arms
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials

Description
Study working paper / report
Citation
Barkowski, Scott, Physician Response to Prices of Other Physicians: Evidence from a Field Experiment (July 29, 2021). Available at SSRN: https://ssrn.com/abstract=3895972 or http://dx.doi.org/10.2139/ssrn.3895972