Beyond a Thousand Clicks

On March 18th, Kaiser Health News (KHN) and Fortune published https://khn.org/news/death-by-a-thousand-clicks/, a deep exposé of all that has gone wrong with the adoption of Electronic Health Records (EHRs) across the physician landscape in the U.S. The article is well researched – it is a good piece of journalism.

But are the findings in this article all that surprising?

Frankly, no.

Prior to the signing of the HITECH Act in 2009, which incentivized physicians to adopt EHRs, EHR adoption nationwide stood at a paltry 12-14% for hospitals and about half that for physician practices. Within a decade, however, EHR adoption soared to over 95% for hospitals. That is a massive uptake of a technology that most healthcare organizations were loath to adopt in the first place.

What could go wrong? Plenty.

As the KHN article points out, we are still working our way through the muck of the false market that the HITECH Act created for EHR software adoption and use. The market became flooded overnight with EHR solutions, and despite the government’s best efforts to certify that these solutions met certain criteria, a lot of questionable software ended up in the market.

Yet even some of the most popular, widely adopted EHR software deployed across the industry is not meeting the needs of end users. In two separate conversations I have had with physicians on my care team, one likened the EHR his large academic medical center uses to a glorified billing system that does nothing to help him actually care for his patients. The second physician, after painfully clicking over and over again to make an appointment, turned to me and stated: “If anything is going to make me lose my love of medicine, it is this EHR software.” And note, this was a freshly installed EHR replacing the health system’s internally developed one.

Deploying a complex enterprise software solution – be it EHR, supply chain, enterprise resource planning (ERP), etc. – is an extremely challenging endeavor regardless of industry sector. I watched first-hand as some of the largest manufacturers in the world did massive rollouts of SAP, Oracle and other systems, and it was never, ever easy. Cost overruns, delays and extremely dissatisfied end users were par for the course. And this was an industry sector that willingly chose to invest in and adopt such systems.

So yes, there is plenty to write about on how the digitization of healthcare has delivered results far, far less than first imagined when HITECH was passed. We still have a long road ahead in optimizing the technology for end users – a technology that was so hastily adopted in a market not fully sold on its intrinsic value. But that is a relatively easy story to tell, albeit one limited in vision and scope.

Keeping an Eye on the Prize

However, what tends to get lost in these discussions is that what we are doing today is laying the foundation for an entirely new, dare I say radical, change in how we will deliver care – a change that will occur along the entire care delivery chain.

The field of medicine has never had such vast troves of computable medical data available to it as we are beginning to see today. The opportunity to do deep analysis on such data will open completely new discoveries, from the efficacy of therapeutics and clinical pathways, to advances in artificial intelligence for more accurate diagnosis, to new treatment modalities and other advances limited only by one’s imagination. One need only look at some of the work Dr. Atul Butte is doing at UCSF to appreciate what we are beginning to unravel today and what the future may hold.

This deeper, broader view of what the adoption and use of EHRs and other enterprise software to support the delivery of care can enable – including new analytics solutions such as those profiled in our recent report – is truly the prize we as a society are after. Let’s keep our eye on the prize.

Did You Know?

Promoting Interoperability: MU Fades to Black

By Brian Murphy and Brian Eastwood

Seeking to liberate the industry from its self-created morass of siloed data and duplicative quality reporting programs, the Department of Health and Human Services (HHS) issued 1,883 pages of proposed changes to Medicare and Medicaid. It renamed the Medicare and Medicaid Electronic Health Record (EHR) Incentive Programs (known to all as Meaningful Use) the Promoting Interoperability (PI) Programs.

As widely reported, it would eliminate some measures that acute care hospitals must report and remove redundant measures across the five hospital quality and value-based purchasing programs. It would also reduce the reporting period to 90 days. HHS will be taking comments until June 25, 2018.

HHS believes that APIs will solve all of the problems that patients and healthcare stakeholders have with data access. HHS also seems prepared to declare that TEFCA compliance and 2015 Edition CEHRT guarantee that those APIs are in place.

Certified EHRs as Enablers of Interoperability

HHS believes that requiring hospitals to use 2015 Edition CEHRT in 2019 makes sense because such a large proportion of hospitals are “ready to use” the 2015 Edition. Ready to use is not the same as using. 2015 Edition EHRs may not be as widely deployed as HHS indicates. The following 10-month-old snapshot from ONC shows that hospitals have not aggressively moved to adopt 2015 Edition CEHRT.

Figure 1: Adoption Levels of 2015 CEHRT
Source: Office of the National Coordinator for Health Information Technology. ‘Certified Health IT Developers and Editions Reported by Hospitals Participating in the Medicare EHR Incentive Program,’ Health IT Quick-Stat #29. Available at https://dashboard.healthit.gov/quickstats/pages/FIG-Vendors-of-EHRs-to-Participating-Hospitals.php.

Current adoption levels by healthcare organizations (HCOs) are undoubtedly better, and many vendors have 2015 Edition technology ready to go, but hospitals can only change so fast. The rush to get hospitals on the most current edition has to do with the most relevant difference between the 2014 and 2015 Editions – the API requirement. APIs will be the technical centerpiece of better, more modern interoperability, but adoption levels are still low. APIs, by themselves, offer the promise of better data liquidity. For this promise to become a reality, healthcare stakeholders need more than just a solid set of APIs.
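To make the data-liquidity point concrete, here is a minimal sketch of the kind of read access a 2015 Edition certified API (in practice, a FHIR REST endpoint) exposes. The base URL and patient ID are hypothetical placeholders, and the open access shown here is a simplification – real deployments sit behind OAuth2/SMART authorization.

```python
# Minimal sketch of the data access a certified API enables.
# The endpoint and patient ID are hypothetical; real servers require
# OAuth2 authorization (e.g., SMART on FHIR) rather than open access.
import requests

FHIR_BASE = "https://ehr.example.org/fhir"  # hypothetical API endpoint

def fetch_patient(patient_id: str) -> dict:
    """Retrieve a single Patient resource as JSON via the FHIR REST API."""
    resp = requests.get(
        f"{FHIR_BASE}/Patient/{patient_id}",
        headers={"Accept": "application/fhir+json"},
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    patient = fetch_patient("12345")  # hypothetical patient ID
    print(patient.get("name"))
```

The simplicity of that call is exactly the promise – and exactly why the surrounding questions of authorization, incentives, and data quality matter more than the API itself.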

Price Transparency: Easier Said Than Done

HHS is also proposing that hospitals post their standard charges and update that list annually.

This is a nice thought, but it will take some heavy lifting to pull this off. For starters, HHS doesn’t even have a definition of “standard charge” and is seeking stakeholder input before the final rule is published. HHS also must determine how to display standard charges to patients, how much detail about out-of-pocket costs to include (for patients covered by public and private insurance), and what noncompliance penalties are appropriate.

Above all, there’s the thorny issue of establishing what a standard charge is in the first place. Charges vary by payer. Can a hospital truly state, without a doubt, the cost of an MRI or a colonoscopy? Most cannot – and technology alone will hardly solve this problem.

Patients (Not) Using Their Data

The existence of APIs will stand in for the old view/download/transmit (VDT) requirement. Regarded as one of Meaningful Use’s most troublesome and fruitless requirements, this rule has been shed by HHS because of “ongoing concern with measures which require patient action for successful attestation.”

VDT is one of several MU Stage 3 requirements pertaining to patient engagement – along with providing secure messaging and patient-specific educational resources – that HHS has proposed dropping, under the pretense that they are “burdensome” to healthcare providers. While hospitals have struggled to get many patients to participate, the VDT requirement set the bar at one patient out of an entire population. What’s more, dropping the requirements fails to take into account how burdensome it is for patients to try to access their data, communicate with their physicians, and learn about their conditions and treatment options. It is also contrary to CMS Administrator Seema Verma’s remarks, first at HIMSS18 and again this week, indicating that the agency seeks to “put patients first.”

HHS says that third-party developed apps that use APIs will deliver “more flexibility and smoother workflow from various systems than what is often found in many current patient portals.” Whether such apps deliver “smoother workflow” is not a foregone conclusion.

Reporting Burden Reduction

HHS proposes “a new scoring methodology that reduces burden and provides greater flexibility to hospitals while focusing on increased interoperability and patient access.” The proposed scoring methodology uses a 100-point system (explained over 24 pages) in which attaining a score of at least 50 means there will be no Medicare (or Medicaid) payment reduction.

Table 1: Proposed Scoring Methodology for Promoting Interoperability Program (PI)
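As a rough illustration of how such a points-based system works, the sketch below weights hypothetical measure performance rates by point values and checks the 50-point threshold. The measure names and weights here are illustrative placeholders, not the actual values in the proposed rule.

```python
# Illustrative sketch of a 100-point PI-style scoring methodology.
# Measure names and point weights are hypothetical placeholders,
# not the actual measures or weights in the proposed rule.
HYPOTHETICAL_MEASURES = {
    "e_prescribing": 10,
    "health_information_exchange": 40,
    "provider_to_patient_exchange": 40,
    "public_health_reporting": 10,
}  # weights sum to 100

def pi_score(performance: dict) -> float:
    """Weight each measure's performance rate (0.0-1.0) by its point value."""
    return sum(
        HYPOTHETICAL_MEASURES[measure] * rate
        for measure, rate in performance.items()
    )

score = pi_score({
    "e_prescribing": 0.8,
    "health_information_exchange": 0.6,
    "provider_to_patient_exchange": 0.5,
    "public_health_reporting": 1.0,
})
# A score of at least 50 avoids the Medicare/Medicaid payment reduction.
print(f"PI score: {score:.0f} -> "
      f"{'no' if score >= 50 else 'subject to'} payment reduction")
```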

HHS is also mulling whether to abandon these measures altogether in favor of scores calculated at the objective level.

The TEFCA Angle

The biggest regulatory effort in recent months related to interoperability, other than this proposal, has been ONC’s proposed Trusted Exchange Framework and Common Agreement (TEFCA), required under the 21st Century Cures Act. TEFCA, well along in the planning stages, is a new set of regulations from ONC whose goal is to catalyze better data availability using APIs. In this regulation, HHS wants public comment on whether participation in a TEFCA-compliant network should replace the process measures in the Health Information Exchange objective. Stated another way: Should TEFCA compliance replace 80 percent of the score for PI (75 percent in 2020)?

TEFCA is widely expected to provide a safe harbor from data blocking liability, although ONC has been mum on this point. TEFCA then could do double duty: eliminate the need to meet or report on health information exchange metrics and provide a shield from data blocking enforcement.

But there are, as yet, unanswered questions about TEFCA:

  1. How much will it cost providers to comply, and can they make money by providing access to their data?
  2. Will TEFCA compliance, as a practical matter, accomplish anything? Will it make it easier for healthcare stakeholders to use each other’s data?

HHS is also considering doing away with the Public Health and Clinical Data Exchange objective. It floated the idea that a provider that supports FHIR APIs for population-level data would not need to report on any of the measures under this objective. Combined with the TEFCA knockout, this would replace 90 percent of the score for PI (85 percent in 2020).

The specific API mentioned, called Flat FHIR and still in development, will probably contribute to part of the complex process of public health and registry reporting. This activity currently requires highly skilled data hunter-gatherers, usually with clinical credentials. In many organizations, these hunter-gatherers manually sift and collate multiple data sources to meet the varied requirements of different registries. Flat FHIR, assuming it becomes production-ready, will certainly help, but it is unlikely to provide all, or even most, of the information needed for the range of public health reporting programs.
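For readers unfamiliar with Flat FHIR, the draft bulk data specification follows an asynchronous kick-off-and-poll pattern, sketched below. The endpoint and group ID are hypothetical, real servers require SMART backend-services authorization, and details may shift while the spec remains in development.

```python
# Sketch of the draft FHIR bulk data ("Flat FHIR") export flow:
# an asynchronous kick-off request followed by status polling.
# Endpoint and group ID are hypothetical; authorization is omitted.
import time
import requests

FHIR_BASE = "https://ehr.example.org/fhir"  # hypothetical endpoint

def start_group_export(group_id: str) -> str:
    """Kick off a population-level export; returns the status-polling URL."""
    resp = requests.get(
        f"{FHIR_BASE}/Group/{group_id}/$export",
        headers={
            "Accept": "application/fhir+json",
            "Prefer": "respond-async",  # async pattern per the draft spec
        },
    )
    resp.raise_for_status()
    return resp.headers["Content-Location"]  # where to poll for status

def poll_export(status_url: str, interval: int = 30) -> dict:
    """Poll until the server returns the manifest of NDJSON output files."""
    while True:
        resp = requests.get(status_url)
        if resp.status_code == 200:      # export complete
            return resp.json()
        if resp.status_code != 202:      # 202 means still in progress
            resp.raise_for_status()
        time.sleep(interval)
```

Even under these assumptions, the output is raw NDJSON resources; mapping that to the varied formats different registries demand is precisely the hunter-gatherer work the paragraph above describes.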

MIPS and APM Concerns

HHS acknowledges that providers are less than thrilled with aspects of the Quality Payment Program (QPP). HHS wants to know how PI for hospitals can better “align” with the requirements for eligible clinicians under MIPS and Advanced APMs. In particular, it wants ideas about how to reduce the reporting burden for hospital-based MIPS-eligible clinicians. It is undoubtedly looking for market-acceptable ideas to reduce the reporting burden where it is arguably more deeply felt – among non-hospital-based MIPS-eligible clinicians. While reducing or eliminating the reporting burden would help such providers, the big unanswered question, as it is with hospitals, is the burden of getting to 2015 Edition CEHRT.

Mandating Interoperability with Other Regulations

HHS also asks the industry how it could use existing CMS health and safety regulations and standards to further advance the electronic exchange of information. It is ready to change the Conditions of Participation (CoPs), Conditions for Coverage (CfCs), and Requirements for Participation (RfPs) for long-term care facilities to this effect. It wants to know whether requiring electronic exchange of medically necessary information in these regulations would move the interoperability needle.

Bottom Line

HHS believes that APIs will solve all of the problems that patients and healthcare stakeholders have with data access. HHS also seems prepared to declare that TEFCA compliance and 2015 Edition CEHRT guarantee that those APIs are in place. It roundly ignores the mesh of incentives that make stakeholders unwilling to share data and patients unable to access data. The industry has cried out for less process reporting and better insight into outcomes for years. This proposal will accomplish the former but set the industry back with respect to the latter if interoperability is declared solved based on technology alone.

eCQMs: Beginning of a Long and Rocky Road

A New Necessity

Payment based upon quality measurement is at the center of a wide range of efforts by various payers, including the federal government, to improve healthcare performance. Historically, quality measurement had to rely on administrative data (e.g., claims), manual chart abstraction, or a combination of the two.

Electronic clinical quality measures (eCQMs) derived from EHRs have been touted as a way to effectively scale the growing requirements of quality management programs. Starting in CY/FY 2014, eligible providers in the EHR Incentive Program beyond their first year of MU participation will have to submit a select number of eCQMs to CMS in order to successfully attest. Providers who fail to successfully attest will be subject to payment reductions of one percent in 2015, two percent in 2016, and three percent in 2017. eCQMs are also being utilized as a method for providers to submit quality data for other federal programs, including the Hospital Inpatient Quality Reporting Program and the Physician Quality Reporting System.

The Office of the National Coordinator for Health Information Technology (ONC) certifies that EHR technologies are capable of accurately calculating eCQM results for the EHR Incentive Program. Initial findings, though, have revealed several issues with abstracting, calculating, and submitting eCQMs.
