Evaluation of the Electronic Health Records Demonstration

Last registered on February 07, 2014

Pre-Trial

Trial Information

General Information

Title
Evaluation of the Electronic Health Records Demonstration
RCT ID
AEARCTR-0000241
First published
February 06, 2014, 6:00 PM EST

First published corresponds to when the trial was first made public on the Registry after being reviewed.

Last updated
February 07, 2014, 9:10 AM EST

Last updated is the most recent time when changes to the trial's registration were published.

Locations

Primary Investigator

Affiliation
Mathematica Policy Research

Other Primary Investigator(s)

Additional Trial Information

Status
Abandoned
Start date
2008-06-01
End date
2011-08-01
Secondary IDs
Abstract
The Electronic Health Records Demonstration (EHRD), implemented by the Centers for Medicare & Medicaid Services (CMS), provided financial incentives to physician practices to use a certified EHR. Practices that met minimum EHR use requirements received payments on a graduated scale, increasing for more use of EHR functions.

The demonstration was implemented in four sites, targeting practices with 20 or fewer providers supplying primary care to at least 50 fee-for-service Medicare beneficiaries. The demonstration was expected to operate for five years (June 1, 2009–May 31, 2014), but was cancelled in August 2011 because 43% of the practices did not meet program requirements. The evaluation used a stratified, experimental design—412 treatment and 413 control practices—to estimate the impacts of the payments on adoption and use of EHR functionalities.

In June 2011, treatment group practices were, on average, 9 to 18 percentage points more likely than control group practices to report using 13 EHR functionalities queried at baseline (2008). The payments increased a summary score of EHR use, which ranged from 1 to 100, by more than 11 points on average relative to the control group (54 versus 43).

Moderate incentive payments did not lead to universal EHR adoption and use in a two-year time frame. However, the demonstration showed that incentives can influence physician use of EHRs. These results are encouraging regarding the potential effectiveness of the EHR Medicare Incentive Program but also suggest that meaningful use of EHRs on a national scale may take longer than anticipated.
External Link(s)

Registration Citation

Citation
Moreno, Lorenzo. 2014. "Evaluation of the Electronic Health Records Demonstration." AEA RCT Registry. February 07. https://doi.org/10.1257/rct.241-2.0
Former Citation
Moreno, Lorenzo. 2014. "Evaluation of the Electronic Health Records Demonstration." AEA RCT Registry. February 07. https://www.socialscienceregistry.org/trials/241/history/1035
Sponsors & Partners

There is information in this trial unavailable to the public.
Experimental Details

Interventions

Intervention(s)
The demonstration targeted practices serving at least 50 traditional fee-for-service Medicare beneficiaries with certain chronic conditions for whom the practices provided primary care. Under the original design, primary care providers (physicians, nurse practitioners, and physician assistants who provide primary care) in practices with 20 or fewer providers were eligible to earn incentive payments for (1) using at least the minimum functions of a certified electronic health records system (a systems payment, with increasing rewards for increasing use); (2) reporting 26 quality measures for congestive heart failure, coronary artery disease, diabetes, and preventive health services (a reporting payment); and (3) achieving specified standards on clinical performance measures during the demonstration period (a performance payment, with increasing rewards for better adherence to recommended care guidelines). All incentive payments under the demonstration were to be made in addition to the FFS Medicare payments practices receive for submitted claims. Physicians could have received up to $13,000 and practices up to $65,000 over the first two years of the demonstration. Because the demonstration was terminated early, the reporting and performance payments were never made; the Centers for Medicare & Medicaid Services made only the systems payments for the first two years of the demonstration, in fall 2010 and fall 2011, totaling $4.5 million.
Intervention Start Date
2009-06-01
Intervention End Date
2011-08-01

Primary Outcomes

Primary Outcomes (end points)
Key measures for the evaluation were practices’ adoption and use of EHRs and other health IT, and a summary (composite) score that quantifies EHR use for the calculation of the incentive payment, which were drawn from a web-based Office Systems Survey (OSS). Specific outcomes were:

- Any EHR/Health IT Use
- Electronic Patient Problem Lists
- Automated Patient-Specific Alerts and Reminders
- Electronic Disease-Specific Patient Registries
- Patients' Email
- Patient-Specific Educational Materials
- Online Referrals to Other Providers
- Laboratory Tests:
* Online order entry
* Online results viewing
- Radiology Tests:
* Online order entry
  * Online results viewing (reports and/or digital films)
- E-Prescribing:
* Printing and/or faxing Rx
* Online Rx transmission to pharmacy

- Overall OSS score
- OSS Score Domains:
* Completeness of information in the EHR
* Communication of care outside the practice
* Clinical decision support
* Increasing patient engagement
* Medication safety
Primary Outcomes (explanation)
In order to calculate EHR summary scores for practices that used a certified EHR, the Office Systems Survey (OSS) measured 53 functions (for example, prescribing medications, ordering laboratory tests and other procedures, and care management and coordination) that are thought to be connected to improved care, although a causative link is not yet empirically proven for many. These functions were sorted into five domains: completeness of information, communication about care outside the practice, clinical decision support, increasing patient engagement, and medication safety. If practices were to use all 53 functions for three-fourths or more of their patients, the total composite score would equal 100. In addition to calculating this score, composite scores were calculated for the five OSS domains. (Baseline scores cannot be estimated because application data on EHR/other health IT use are only available for 13 of the 53 functions.) Based on the total composite score for each treatment practice, CMS calculated payments during each demonstration year. Practices received their payments in the fall following the end of each demonstration year.
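The scoring rule above implies equal weighting across the 53 functions, with full credit when a function is used for three-fourths or more of patients. The sketch below illustrates that arithmetic under that assumption; the actual CMS scoring formula (e.g., any partial credit for lower usage shares) is not described here, so the threshold-only rule is illustrative.

```python
# Illustrative OSS-style composite score, assuming equal weights across
# functions and full credit only when a function is used for >= 75% of
# patients. The actual CMS scoring rule may differ (e.g., partial credit).
def composite_score(usage_shares):
    """usage_shares: share of patients (0-1) for which each surveyed
    function is used; the OSS measured 53 such functions."""
    credited = sum(1 for s in usage_shares if s >= 0.75)
    return 100.0 * credited / len(usage_shares)

# A practice using every function for at least 75% of patients scores 100:
print(composite_score([0.8] * 53))  # -> 100.0
```

Domain scores would be computed the same way over each domain's subset of functions.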

Secondary Outcomes

Secondary Outcomes (end points)
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
The EHRD evaluation used a stratified, experimental design to allocate 825 eligible practices that volunteered for the EHRD to treatment and control groups. This design was used to achieve balance on practice characteristics that are important predictors of adoption and use of electronic health records systems. In February 2009, practices from the four sites (Louisiana, Maryland/DC, Pennsylvania, and South Dakota) were randomized in equal proportions into treatment and control practices within strata, defined by site, number of primary care physicians, and whether the practice was in a medically underserved area.
Experimental Design Details
All randomized practices were included in an intent-to-treat analysis. Using data from all practices that completed the 2011 Office Systems Survey (OSS), treatment-control differences in any EHR/health IT use and in use of each of the 13 EHR functions were estimated using separate regressions. A similar analysis was conducted for the overall OSS summary score and the five OSS domain scores. The regressions adjusted for the stratifying variables and the baseline measure of the 13 functions; including these variables adjusts for any differences between treatment and control groups due to survey nonresponse. Observations were weighted to adjust for survey nonresponse and non-random demonstration attrition. Sensitivity tests confirmed that the results were similar in regressions without baseline control variables and in regressions without weights. Analyses were conducted in Stata.
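The estimation strategy above (a weighted regression of the outcome on a treatment indicator plus controls) can be sketched as follows. All data here are simulated and all names are hypothetical; this is a minimal analogue of the design, not the evaluation's code or data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical practice-level data: a treatment dummy, a baseline score,
# nonresponse weights, and an outcome with a built-in 11-point effect
# (numbers chosen only to mirror the reported magnitude).
n = 600
treat = rng.integers(0, 2, n)
baseline = rng.normal(40.0, 10.0, n)
w = rng.uniform(0.5, 1.5, n)              # survey-nonresponse weights
y = 40.0 + 11.0 * treat + 0.3 * baseline + rng.normal(0.0, 8.0, n)

# Weighted least squares via the sqrt-weight transform: scale both the
# design matrix and the outcome by sqrt(w), then run ordinary OLS.
X = np.column_stack([np.ones(n), treat, baseline])
sw = np.sqrt(w)
beta, *_ = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)
print(round(beta[1], 1))                  # estimated treatment effect
```

With clustering at the practice level and practice-level outcomes, standard errors would not need further cluster adjustment; beneficiary-level outcomes would.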
Randomization Method
Randomization was done in the office by computer, using the Stata routine ralloc, written by Philip Ryan. The code is available at:
http://econpapers.repec.org/software/bocbocode/s319901.htm
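For intuition, the stratified 1:1 allocation that ralloc performs can be mimicked with a few lines of code: shuffle practices within each stratum, then alternate arm assignments so the arms stay balanced within strata. This is a hypothetical illustration, not the ralloc routine itself (which supports blocking and other features).

```python
import random
from collections import defaultdict

def stratified_assign(practices, seed=0):
    """Illustrative stratified 1:1 assignment.
    practices: list of (practice_id, stratum) tuples."""
    rng = random.Random(seed)
    by_stratum = defaultdict(list)
    for pid, stratum in practices:
        by_stratum[stratum].append(pid)
    assignment = {}
    for ids in by_stratum.values():
        rng.shuffle(ids)                       # random order within stratum
        for i, pid in enumerate(ids):          # alternate arms for balance
            assignment[pid] = "treatment" if i % 2 == 0 else "control"
    return assignment
```

Within every stratum the two arms differ in size by at most one practice, matching the registry's near-equal split (412 vs. 413).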
Randomization Unit
Physician practices
Was the treatment clustered?
Yes

Experiment Characteristics

Sample size: planned number of clusters
825 practices
Sample size: planned number of observations
52,287 Medicare beneficiaries in the first year of the demonstration (for the analysis of the impact of the intervention on Medicare service use and expenditures)
Sample size (or number of clusters) by treatment arms
412 practices received financial incentives; 413 practices served as controls
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
For a continuous composite score that measures adoption and use of electronic health records systems (see Outcomes section above), the MDE was 9.9 percent of the control group mean. The MDE assumes a response rate of 85 percent to the Office Systems Survey; an intracluster correlation coefficient of 0.037; a coefficient of variation of 2.2; a significance level of 0.10 for a two-sided test; 80 percent power; and a regression R-squared of 0.3. For binary variables, the MDE was 10 percentage points with a mean of 0.5, also assuming a regression R-squared of 0.3. For both outcomes, the sample size was 267 control group and 314 intervention respondents. The impacts of the demonstration on practices' use of health IT were generally much larger than the MDEs, suggesting the evaluation was well powered for these outcome measures.
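The exact power formula behind these figures is not shown; a generic MDE calculation for a two-arm clustered comparison, using the stated significance level, power, R-squared, and intracluster correlation, would look like the sketch below. It returns the MDE in standard-deviation units and is an assumption-laden approximation, not a reproduction of the evaluation's calculation.

```python
from statistics import NormalDist

def mde_clustered(n_t, n_c, icc=0.0, m=1, r2=0.0, alpha=0.10, power=0.80):
    """Generic MDE (in SD units) for a two-arm comparison.
    n_t, n_c: respondents per arm; icc: intracluster correlation;
    m: average cluster size; r2: regression R-squared from covariates."""
    nd = NormalDist()
    z = nd.inv_cdf(1 - alpha / 2) + nd.inv_cdf(power)   # ~2.49 here
    deff = 1 + (m - 1) * icc                            # design effect
    var = (1 - r2) * deff * (1 / n_t + 1 / n_c)
    return z * var ** 0.5

# With the registry's respondent counts and R-squared (practice-level
# outcome, so m = 1 and the ICC does not inflate the variance):
print(round(mde_clustered(314, 267, r2=0.3), 3))
```

Converting the SD-unit MDE to "percent of the control mean" would use the stated coefficient of variation.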
IRB

Institutional Review Boards (IRBs)

IRB Name
IRB Approval Date
IRB Approval Number

Post-Trial

Post Trial Information

Study Withdrawal

There is information in this trial unavailable to the public.

Intervention

Is the intervention completed?
Yes
Intervention Completion Date
August 01, 2011, 12:00 +00:00
Data Collection Complete
Yes
Data Collection Completion Date
May 31, 2011, 12:00 +00:00
Final Sample Size: Number of Clusters (Unit of Randomization)
754 practices
Was attrition correlated with treatment status?
Yes
Final Sample Size: Total Number of Observations
800,524 (Total for the demonstration; for the analysis of the impact of the intervention on Medicare service use and expenditures)
Final Sample Size (or Number of Clusters) by Treatment Arms
362 practices received financial incentives; 392 practices served as controls
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
No
Reports, Papers & Other Materials

Relevant Paper(s)

Abstract
Background
The Electronic Health Records (EHR) Demonstration, implemented by the Centers for Medicare & Medicaid Services (CMS), provided financial incentives to physician practices to use a certified EHR. Practices that met minimum EHR use requirements
received payments on a graduated scale, increasing for more use of EHR functions.

Methods
The demonstration was implemented in four sites, targeting practices with 20 or fewer providers supplying primary care to at least 50 fee-for-service Medicare beneficiaries. The demonstration was expected to operate for five years (June 1, 2009–May 31, 2014), but was cancelled in August 2011 because 43% of the practices did not meet program requirements. The evaluation used a stratified, experimental design—412 treatment and 413 control practices—to estimate the impacts of the payments on adoption and use of EHR functionalities.

Results
In June 2011, treatment group practices were, on average, 9 to 18 percentage points more likely than control group practices to report using 13 EHR functionalities queried at baseline (2008). The payments increased a summary score of EHR use, which ranged from 1 to 100, by more than 11 points on average relative to the control group (54 versus 43).

Conclusion
Moderate incentive payments did not lead to universal EHR adoption and use in a two-year time frame. However, the demonstration showed that incentives can influence physician use of EHRs. These results are encouraging regarding the potential effectiveness of the EHR Medicare Incentive Program but also suggest that meaningful use of EHRs on a national scale may take longer than anticipated.
Citation
Do Financial Incentives Increase the Use of Electronic Health Records? Findings from an Experiment

Reports & Other Materials