2019 Predictions: M&A, Big Tech, and the Fate of ACOs
The Meaningful Use gravy train finally came to an end in 2018. As the strongest EHR vendors struggle to define new revenue streams, weaker ones are fading from view through acquisitions or leveraged buyouts. Meanwhile, funding for ‘digital health’ start-ups continued to increase, though it likely hit its high-water mark in 2018. And lest we forget, Amazon, Apple and Google continue their forays into the healthcare sector as the market is simply too big to ignore.
So what’s in store for 2019?
We brought together our analysts’ brain trust and came up with the following baker’s dozen of 2019 predictions. Over nearly a decade of making these annual predictions, our batting average has consistently been well above .500, so don’t ever say we didn’t give you advance warning on the following:
Revenue cycle management M&A activity will continue to pick up, with the most notable acquisition by Optum as it doubles down on its Optum360 managed revenue cycle business and acquires Conifer Health Solutions from Tenet.
Despite the hype and media attention around alternative primary care clinics (e.g., Oak Street Health, ChenMed, One Medical), the actual number of physical locations serving patients will remain paltry at less than ten percent of the number of retail health clinic locations.
Walgreens will likely make the first move to acquire Humana in 2019, but Walmart will outbid Walgreens to win Humana over.
The pace of FDA approvals for algorithms in 2018 was impressive and shows no signs of abating. Additionally, 2020 will see a further tripling of regulatory approvals for AI.
Consumers’ use of telehealth will continue to see rapid growth and rising competition leading to significant consolidation among the plethora of vendors. By year-end, a major non-healthcare-specific consumer brand will join the mix, and the market will be down to five direct-to-consumer (DTC) nationwide brands.
By the end of 2019, every major healthcare analytics vendor will provide a cloud-hosted offering with optional data science and report development services.
Cloud offerings have become far more robust, concurrent with HCOs’ struggles to recruit IT talent and control costs. Amazon’s AWS and Microsoft’s Azure will be clear winners while Google’s own cloud infrastructure services will remain a distant third in 2019.
Laws and regulations to-date have not compelled providers to freely share data with patients. ONC’s information blocking rule, which will be released before the end of 2018, will make it easier to transfer data to other organizations but will do little to open the data floodgates for patients, clinicians, and developers.
Despite loud protests, the vast majority of provider-led MSSP ACOs will take on downside risk as CMS shows flexibility in waivers. However, hospital-led ACOs, which continue to struggle with standing up a profitable MSSP ACO, will exit the program in 2019.
Continued changes in post-acute care reimbursement, especially from CMS, combined with the migration to home-based services, puts further economic strain on these facilities. Nearly twenty percent of post-acute care facilities will shutter or merge in 2019.
The warning signs of the last couple of months suggest the stock market has become skittish. This will extend well into 2019 (if not lead to a mild recession). It will hardly be an ideal time for an IPO, and those planned by Change Healthcare, Health Catalyst and others will wait another year.
Elon Musk will have a nervous breakdown leading him to reinvent the healthcare system from his bed during his two-week recovery at Cedars-Sinai.
Revisiting Our 2018 Predictions
As is our custom here, we like to look back on our predictions for the closing year and see just how well we did. Some years we do amazingly well; others we over-reach and miss on quite a few. For 2018, we got seven of our 13 predictions spot-on, two were mixed results, and four failed to materialize. If we were a batter in MLB, our .538 batting average would have put us in MVP contention. But we are not, and have to accept that some years our prediction average may hover just above the midpoint, as it did this year.
Stay tuned, 2019 predictions will be released in about one week and it is our hope that they will inspire both rumination and conversation.
(Note: the plain text below shows the original predictions we made in late 2017, while the italic text is our review of how they played out in 2018.)
Major mergers and acquisitions that mark the end of 2017 (CVS-Aetna, Dignity Health-CHI and rumored Ascension-Providence) will spill over into 2018. Both Humana and Cigna will be in play, and one of them will be acquired or merged in 2018.
MISS – neither happened. However, Cigna did pick up PBM service Express Scripts, and rumors continue to swirl about a possible Humana-Walmart deal or, more recently, even a Walgreens-Humana deal.
Hot on the heels of CVS’ acquisition of Aetna, growth in retail health reignites, albeit off a low overall footprint. By end of 2018, retail health clinic locations will exceed 3,000 and account for ~5% of all primary care encounters; up from 1,800 and ~2%, respectively, in 2015.
MISS – Modest growth in 2018 for retail health clinics, with an estimate of ~2,100 locations by year’s end. Telehealth, which is seeing rapid growth, and on-site clinics may be partially to blame.
In a bid to one-up Samsung’s partnership with American Well, and to establish itself as the first tech giant to disrupt healthcare delivery, Apple will acquire a DTC telehealth vendor in 2018.
MISS – Apple continues to work on the periphery of care with a focus on driving adoption of its Health Records service in the near-term with a long-term goal of patient-directed and curated longitudinal health records.
Despite investments in population health management (PHM) solutions, payers still struggle with legacy back-end systems that hinder timely delivery of actionable claims data to provider organizations. The best intentions for value-based care will flounder and 60% of ACOs will struggle to break even. ACO formation will continue to grow, albeit more slowly, to mid-single digits in 2018 to just under 1,100 nationwide (up from 923 as of March 2017).
HIT – MSSP performance data showed only 34% earned shared savings in 2017 (up from 31% in 2016) and by year’s end it is estimated there will be ~1,025 ACOs in operation.
While some of the major EHR vendors have announced support for write access sometime this year and will definitely deliver this support to their most sophisticated customers, broad-based use of write APIs will happen after 2018. HCOs will be wary about willy-nilly changes to the patient record until they see how the pioneers fare.
HIT – FHIR-based read APIs are available from all of the major EHR vendors. Write APIs are still hard to find. To be fair, HCOs as a group are not loudly demanding write APIs.
True cloud-based deployments from name brand vendors such as AWS and Azure are in the minority today. But their price-performance advantages are undeniable to HIT vendors. Cerner will begin to incent its HealtheIntent customers to cloud host on AWS. Even Epic will dip its toes in the public cloud sometime in 2018, probably with some combination of Healthy Planet, Caboodle, and/or Kit.
HIT – adoption of cloud computing platforms is accelerating quickly across the healthcare landscape for virtually all applications. Cloud-hosted analytics is seeing particularly robust growth.
Providers will continue to lag behind payers and self-insured employers in adopting condition management solutions, for two key reasons: in particular, CMS’s reluctance to reimburse virtual Diabetes Prevention Programs, and in general, the less than 5% uptake of the CMS chronic care management billing code. By lagging, providers risk further isolation from value-based efforts to improve outcomes while controlling costs.
HIT – Awareness of the CCM billing code (CPT code 99490) remains moderate among providers, and adoption is still estimated at a paltry sub-15%.
Mobile accessibility is critical for dynamic care management, especially across the ambulatory sector. More than 75% of provider-focused care management vendors will have an integrated, proprietary mobile application for patients and caregivers by end of 2018. These mobile-enabled solutions will also facilitate collection of patient-reported outcome measures, with 50% of solutions offering this capability in 2018.
MIXED – While the majority of provider-focused care management vendors do have an integrated mobile application (proprietary or partnership), collecting PROMs is still a functionality that remains limited through an integrated mobile solution.
A wide range of engagement, PHM, EHR, and care management solutions will make progress on documenting social determinants of health, but no more than 15% of solutions will be able to automatically alter care plan interventions based on SDoH in 2018.
HIT – despite all the hoopla in the market about the need to address SDoH in care delivery, little has been done to date to directly affect dynamic care plans.
The hard, iron core of this issue is uncertainty about its real impact. No one knows what percentage of patients or encounters are impacted when available data is rendered unavailable – intentionally or unintentionally. Data blocking definitely happens but most HCOs will rightly wonder about the federal government’s willingness to go after the blockers. The Office of the National Coordinator might actually make some rules, but there will be zero enforcement in 2018.
MIXED – Last December we said, “The hard iron core of this issue is uncertainty about its real impact.” Still true. Supposedly, rulemaking on information blocking is complete but held up in the OMB. The current administration does not believe in regulation. So “data blocking” may be defined but there was and will be no enforcement or fines this year.
Providers will pull back on aggressive plans to broadly adopt and deploy PHM solution suites, leading to lackluster growth in the PHM market of 5% to 7% in 2018. Instead, the focus will be on more narrow, specific, business-driven use cases, such as standing up an ACO. In response, provider-centric vendors will pivot to the payer market, which has a ready appetite for PHM solutions, especially those with robust clinical data management capabilities.
HIT – PHM remains a challenging market from both payment (at-risk value-based care still represents less than 5% of payments nationwide) and value (lack of clear metrics for return on investment) perspectives. All PHM vendors are now pursuing opportunities in the payer market, including EHR vendors.
This is a case where the threat of alert fatigue is preferable to the reality of report fatigue. Gaps are important, and most clinicians want to address them, but not at the cost of voluminous dashboards or reports. A single care gap that is obvious to the clinician opening a chart is worth a thousand reports or dashboards. By the end of 2018, reports and dashboards will no longer be delivered to front-line clinicians except upon request.
MISS – Reports and dashboards are alive and well across the industry and remain the primary way to inform front-line clinicians about care gaps.
Arterys, Quantitative Insights, Butterfly Network, Zebra Medical Vision, EnsoData, and iCAD all received FDA approval for their AI-based solutions in 2017. This is just the start of AI’s future impact in radiology. Pioneer approvals in 2017 — such as Quantitative Insights’ QuantX Advanced breast CADx software and Arterys’s medical imaging platform — will be joined by many more in 2018 as vendors look to leverage the powerful abilities of AI/ML to reduce labor costs and improve outcomes dependent on digital image analysis.
HIT – With about a month left in 2018, the count of FDA-approved algorithms year to date is approaching 30 and could hit three dozen by year end. This is a significant ramp-up in the regulatory pipeline, but more is needed in the way of clear FDA guidance on how it plans to review continuously learning systems and on best practices for leveraging real-world evidence in algorithm training and validation.
What do you think of 2018 for health IT?
TEFCA Response Drives ONC to Reconsider
The public response to ONC’s proposed Trusted Exchange Framework and Common Agreement (TEFCA) and U.S. Core Data for Interoperability (USCDI) underscores the pent-up demand for better healthcare interoperability. Required by the 21st Century Cures Act, TEFCA and USCDI could impact the future of healthcare data portability in the same way that the EHR Incentive Program shaped EHR adoption. The goal is to catalyze better data availability needed to modernize provider and payer IT application portfolios. Taken at face value, these regulations could profoundly change the way that healthcare stakeholders provide and use each other’s data.
This isn’t ONC’s first hands-on attempt to crack the interoperability nut. Years ago, it awarded a cooperative agreement to DirectTrust to oversee the rollout of secure messaging across the industry. In return, it got a slow but steady adoption curve that has moved the needle but fallen short of providing universal data availability.
TEFCA and USCDI could impact the future of healthcare data portability in the same way that the EHR Incentive Program shaped EHR adoption.
This time around, ONC has higher hopes for a more accelerated and effective result. While the former effort relied on established technology, this proposal leans heavily on technology that does not exist – in or out of healthcare – and an architectural approach with a checkered history. The DirectTrust effort combined carrot and stick incentives, while this one offers only the potential for protection from the stick.
TEFCA proposes to join clinicians, hospitals, payers, and others in a national network of networks. It would designate a set of organizations called “Qualified HINs” (QHINs) to facilitate a standardized “on-ramp” for nationwide healthcare connectivity. A separate organization called the Recognized Coordinating Entity (RCE), to be funded by ONC through a cooperative agreement, would manage and oversee this network of QHINs. Once a user or organization is on this national on-ramp, it will be able to exchange data for three different use cases across six permissible purposes.
USCDI defines the data types eligible – or required, depending on how you read it – for exchange. It builds on the Common Clinical Data Set (CCDS) and will expand the roster of exchangeable data types based on maturity and market acceptance. In the short term, it proposes to add structured and unstructured notes and data provenance to the list of CCDS data items.
The response to this proposed regulation was not only voluminous by the standards of ONC rulemaking, but also filled with questions, concerns, requests for clarification, and conflicting interpretations. The sheer number of distinct issues raised by commenters is evidence of the complexity of the topic, the difficulty of ONC’s task, and the problems with the proposal. Healthcare stakeholders want and need to leverage existing investments, but ONC’s new plan looks more like rip-and-replace. ONC’s insistence that TEFCA is voluntary and non-regulatory is cold comfort to any organization trying to forecast how much it will cost to comply.
Before delving into the specific concerns, it’s important to note that the general approach of TEFCA and USCDI moves the conversation in the right direction. It could turn out to be less costly for providers to sign up for one network than it is to enroll in the multiplying number of purpose-built clinical and financial networks. TEFCA also emphasizes query-based exchange of discrete data elements, which represents a desirable interoperability outcome.
ONC has to be pondering its options in the wake of such a stiff and unified response. The 21st Century Cures Act specifically requires it to develop TEFCA and establish rules for data blocking. The Senate HELP committee has already sought testimony on ONC’s progress on both fronts. Besides Congress, the highest levels of the current administration want action to solve the problem of data liquidity. ONC has to act, and soon.
Prior to the public response to TEFCA, ONC was on course to award the RCE cooperative agreement to the Sequoia Project, on the strength of its handling of the implementation and maintenance of the eHealth Exchange (transitioned to it by ONC) and the governance muscle Sequoia showed in Carequality’s adoption curve. This is relatively unsurprising, since few other existing organizations have the independence, HIT expertise, scale, and track record that Sequoia has assembled.
ONC also seemed destined to declare that TEFCA compliance would provide a safe harbor against data blocking liability. ONC chose not to define data blocking and TEFCA in a single set of regulations. Instead, it asked the industry to accept new “rules of the road” for information exchange without knowing what happens when an organization can’t comply with those rules. ONC did not help its case by forcing the industry to conclude that TEFCA compliance is the only way to qualify for the data blocking safe harbor. Every conversation we’ve had about TEFCA since January is primarily about this linkage. Such a requirement is especially controversial among the smaller organizations that make up the bulk of healthcare, to the extent they are even aware that such a rule could happen. That ONC thought it could deal with data blocking in two regulatory steps is hard to understand.
Our prediction is that ONC will seek another round of public comments and revise TEFCA based on what it hears. It will then release a final version late this year that preserves the overall structure of TEFCA and USCDI from the current proposal. ONC will deal with objections by adding exemptions and exceptions that will phase in some of its more stringent provisions. Most of the exemptions and exceptions will concern TEFCA’s provisions for the data blocking safe harbor and population-level query. Effectively, it will delay the operation of these provisions to some later date. It will probably go ahead and award the cooperative agreement for RCE to the Sequoia Project once TEFCA is in final form late this year.
With all eyes on TEFCA, ONC has been mum about the fate of the “EHR reporting program,” also required by the Cures Act. This program is intended to help hospitals, clinicians, and other users of health information technology better understand how these products support interoperability and usability. ONC originally claimed it could not act on this requirement, pleading lack of budget. Then Congress rejected HHS’ request for a $22 million reduction in ONC’s budget, presumably enabling ONC to develop the program.
Driving Policy Without Healthcare Organizations: A Fool’s Errand
Healthcare in the United States faces many problems, but one of the bigger ones is bringing the right stakeholders to the table when it’s time to try to solve a problem. Often, the empty chair should be occupied by an individual – the overwhelmed patient, the uncompensated caregiver, the burned-out doctor or nurse. Rarely is an institution not represented.
That’s why the recent Accenture white paper (PDF) on the role of patient-generated health data (PGHD) in contributing to clinical research and improving healthcare delivery left me scratching my head. The document sets forth a good roadmap for developing federal policy, and it examines the role of the individual clinician in using PGHD, but it omits a key stakeholder: the healthcare organization (HCO) itself.
First, a bit of background. The Office of the National Coordinator for Health IT contracted with Accenture in October 2015 to draft a white paper that ONC would use to develop a policy framework for capturing, using, and sharing PGHD. Such a framework is called for in ONC’s 2015 interoperability roadmap, which set the ambitious goal of achieving interoperability by 2024. Accenture completed its research in October 2016 and published a draft of the white paper last month. Public comments are welcome; no deadline has been stated.
Admittedly, the white paper is clear on what needs to happen: PGHD needs to make its way into the clinical record so that care teams can make more informed decisions, patients can better engage with individual physicians as well as HCOs at large, and medical research can expand its reach far beyond the four walls of an institution. This will allow healthcare to shift from herky-jerky episodic care to more continuous longitudinal care.
So far, so good. However, the white paper goes on to identify researchers, clinicians, and patients and caregivers as the key stakeholders in the use of PGHD. Policymakers, technology vendors, and payers and employers are the secondary stakeholders. Provider organizations are mentioned in passing but not identified as major stakeholders.
The policy framework focuses on the needs and challenges of individual medical professionals without addressing the larger needs of the institutions that employ them. Consider, for example, the characterization of organization-wide use of PGHD:
Innovative health care organizations have incorporated the use of PGHD into their current workflows in ways that prevent burdening the care team with extra work or overwhelming them with extraneous data. The care team needs to share responsibilities among team members to reduce the burden of collecting, verifying the quality and provenance, and analyzing PGHD. Some organizations have assigned specific members of the care team to review PGHD, determine where to store the data, notify providers of abnormal values, and respond to the patient. In the future, the EHR will be able to facilitate PGHD review, helping to simplify the clinician workflow.
Incorporating PGHD into current workflows, sharing responsibilities, and assigning care team members to review data (when it is likely not part of their training) takes years of internal review and discussion – to say nothing of the assumption that EHRs will suddenly accept PGHD while simultaneously simplifying the clinical workflow when, to date, they have done neither.
Considering the extent to which HCOs will be expected to collect, curate, normalize, store, secure, analyze, interpret, and share PGHD, either directly or indirectly, it’s irresponsible for a policy framework, even in its draft form, to gloss over the challenges and opportunities that providers face. HCOs need to be part of that conversation. If anything, they should host that conversation.
Show me the (criteria that must be met in order to receive the) money!
Of course, this isn’t the first time that federal health IT policy has descended from Washington with little input from HCOs. Previous policies worked, at least to a certain extent, because explicit incentives and penalties came along with them. (Some of the most popular content from my days as a journalist explained the intricacies of meaningful use criteria and deadlines, complete with hand-coded tables.)
As written, the PGHD policy framework mentions HCO incentives only in the context of new payment models – specifically, their “incentives to monitor a patient’s health status outside of an office visit to reduce the need for face-to-face encounters and to reduce patient use of emergency and inpatient care.”
It’s an implicit incentive, not an explicit one. That means it’s not going to attract the attention of burned-out doctors or nurses being asked to take on yet another uncompensated task, not to mention frazzled HCO leaders wrestling with tight budgets, tighter margins, and growing expectations.
Incentives and penalties will certainly get HCOs to notice. Crafting them will be difficult, though. Rewarding HCOs for using data from connected, healthy patients (like yours truly) will “check the box” but won’t move the needle on VBC. Insisting that all data come from disconnected, unhealthy patients (like the millions who require continuous longitudinal care) will doom PGHD initiatives before they can even start.
Understanding how to write those policies, and then putting them into place, will be a heck of a lot easier if HCOs themselves have a say in what they look like. And that will be a heck of a lot easier if HCOs have a say in what the final PGHD policy framework looks like.
HIMSS’15: NOT on FHIR
HIMSS15 was supposed to be an opportunity for HIT vendors to really expand on interoperability with possibilities represented by FHIR and other newish standards. At least that’s what I thought.
Interoperability is an overarching concern across HIT, and many are expecting big things from FHIR. Only Epic proclaimed its faith flashily, with a stylized HL7 FHIR logo inside an actual fireplace. Yet the company gave some people who asked about data integration the bum’s rush, while also announcing that it would not be charging — at least for a few years — for data transfers between Epic customers and non-Epic customers with Care Elsewhere. Very confusing. Otherwise, the hyping of FHIR at a poster or display level was far more understated than I expected. Inside meeting rooms, on the other hand, FHIR came up in every conversation I had.
These conversations fell into two patterns. The first, and most numerous, followed a familiar script: we are closely monitoring this new technology, recognize its enormous potential, and will evaluate how and when to build solutions based on what makes sense for our customers.
The second set of conversations sounded decidedly less cautious. These vendors expressed strong optimism about the benefits of FHIR as a standard but were essentially vague about product plans. While the interoperability issues facing healthcare are far broader than FHIR, I was still expecting a little more substance than was on display in Chicago, especially concerning all the hype we’ve been hearing about interoperability in recent months.
For those who crave substance, a well-attended session by David McCallie of Cerner and Stan Huff of Intermountain set the tone and made the conference for me. This presentation described one potential path to a more interoperable future. These two interoperability stalwarts position the EHR as the system of record and pluggable lightweight applications as the system of engagement for healthcare. Connecting them and providing the data fuel will be a set of FHIR-based APIs. The youngish crowd in the big hall asked a lot of questions and seemed genuinely ready to seize on this approach as a way to penetrate the walled gardens of our EHR-mediated HIT landscape.
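To make the record/engagement split concrete: in this architecture, a lightweight app never touches the EHR database directly; it issues a RESTful FHIR query and parses the returned Bundle. Below is a minimal sketch in Python, assuming a hypothetical server base URL; the Patient and Bundle shapes follow the published FHIR specification, but the identifiers and values are made up for illustration.

```python
import json

# A FHIR read API is just a RESTful query against the EHR's system of
# record, e.g.:
#   GET https://ehr.example.com/fhir/Patient?family=Smith   (hypothetical URL)
# The server responds with a Bundle resource; a lightweight "system of
# engagement" app only needs to parse JSON. A minimal search response:
sample_bundle = json.loads("""
{
  "resourceType": "Bundle",
  "type": "searchset",
  "entry": [
    {"resource": {"resourceType": "Patient", "id": "123",
                  "name": [{"family": "Smith", "given": ["Jane"]}]}}
  ]
}
""")

def patient_names(bundle):
    """Extract display names from the Patient resources in a search Bundle."""
    names = []
    for entry in bundle.get("entry", []):
        resource = entry.get("resource", {})
        if resource.get("resourceType") == "Patient":
            for name in resource.get("name", []):
                names.append(" ".join(name.get("given", []) + [name.get("family", "")]))
    return names

print(patient_names(sample_bundle))  # ['Jane Smith']
```

The appeal of the approach is exactly this simplicity: any app that can speak HTTP and JSON can plug in, without vendor-specific integration work.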
I was pretty focused on FHIR this year because it will be an important element in solving the broader interoperability problem. And on that front – the Friday before HIMSS15, ONC issued a report to Congress on information blocking in healthcare.
Not too surprising given the timing, no one I talked to at HIMSS had read it, and most seemed unaware of it.
The gist of this report is that information blocking is most definitely a problem. ONC’s most significant findings were that the scope of the problem is not well understood and that the causes are murkier than the simple competitive concerns of EHR vendors and large providers. The key takeaway is that business practices, rather than technology, may be the primary cause of information blocking and, by inference, of the lack of interoperability.
Since HIMSS15, I have had several conversations with both vendors and providers who mention this report, usually in passing, as a way to illustrate how far healthcare has to go to make patient data as portable as it needs to be to deliver truly coordinated care. Next year, I am hoping that vendors will be talking in more concrete terms about the ways they have implemented FHIR-based data access for HCOs.
ONC Catalyzing a National Interoperability Plan
ONC’s first draft of a nationwide interoperability roadmap is ambitiously vast in scope, but ultimately constrained by the past. Its purpose is to initiate a discussion within healthcare about the ways and means to achieve interoperability within 10 years – notwithstanding that this discussion is already the obsession of many. ONC hopes the document will launch a process that results in a national public-private strategy for supporting the kind of interoperable data infrastructure that will enable a learning health system.
The roadmap focuses on several major policy and technical themes: the impact of FFS on vendor and provider attitudes toward data sharing, the potential for private payers and purchasers to incentivize data sharing, and the central importance of standards and of incentivizing compliance. This long, comprehensive, and in places incredibly detailed document is really three documents in one. For those lacking the time to read through its 160+ pages, we summarize:
Part I – Letter from ONC chief Karen DeSalvo and executive summary that lays out a set of questions intended to guide the response to the overall document as well as a series of “calls to action” to galvanize industry participation. (Pages 1-15)
Part II – Exhaustive presentation of the current and potential future state of interoperability, as well as the challenges and opportunities that lie between. (Pages 16-162)
Part III – The final appendix containing 56 “Priority Interoperability Use Cases” which ONC wants to winnow down. (Pages 163-166)
Part I gives a sense of which way ONC is leaning by focusing on select high-level issues: payment policy, data governance, semantic and transactional standards, measurement of results, and probably most important, the priority use cases. The language used within the roadmap is reflective of a sea change in thinking: “send and receive” has been replaced with “send, receive, find, and use” as a way to describe what individuals need from interoperable HIT.
Part II lays out the elements of what it hopes will become the national roadmap for interoperable HIT. The roadmap’s focus on measurement – tracking and gauging the metrics of interoperability – is eerily similar to the EHR Incentive Program’s MU measures right down to using various numerators and denominators. If we learned anything from the EHR Incentive Program it is that the industry liked the incentives but disliked the actual MU objectives. Absent the carrot of incentives and/or the sting of penalties, it is hard to see how ONC can catalyze providers to embrace yet another set of operational metrics.
ONC continues to struggle with patient matching. We know that Congress will not countenance so much as a voluntary (on the part of patients) national patient identifier. Even if it did, the costs to the industry of maintaining a dual system would only add complexity to an overburdened system. Unfortunately, we think that ONC and HHS are powerless to change this extravagantly costly element of the interoperability conundrum.
Who Uses What Data?
Embedded in ONC’s treatment of HIPAA, data governance, and data portability lies an essential, unresolved issue: the rights and responsibilities of the various stakeholders with respect to data governance. How exactly can personal health information (PHI) that is captured by clinical applications be shared within the context of care delivery?
Today, such rules of the road remain unclear, ambiguous, and deeply complex. Existing laws, both state and federal, are a patchwork in which various participants hold back data, fearing liability where most often none exists. At the same time, other participants claim outright to “own” this data and use the patchwork to consolidate their competitive position.
ONC rightly points out that ATM networks and airline reservation networks provide interoperability for radically simpler use cases than the health system requires (they also don’t quite have the regulatory complexity of health data). While true, the data practices of consumer-focused transaction networks are effectively incomprehensible to the average consumer. So why should we assume that health data exchange must first be made comprehensible to consumers in order to work?
Is it reasonable to expect any patient to understand the protections offered in the vastly more complex health data realm? Resoundingly no. ONC could be soliciting input about a comprehensive reconceptualization of data rights and responsibilities with respect to patient data. Admittedly, only Congress can act to change the existing regime, and even Congress is limited by legislation enacted at the state level.
This is one aspect of interoperability that will always confound those seeking true interoperability and data exchange in the context of health. It is also why some proponents believe that interoperability will only truly occur once the patient has full control of their data (health data bank) and defines access to their PHI. But even here, we have a very long way to go before the majority of citizens take on that responsibility.
Standards and Compliance
Most of the ideas embodied in the JASON Report and the subsequent JASON Task Force are offered up for public comment. The roadmap suggests that HL7’s new standard, FHIR, could be effective in the 6-10 year timeframe, considerably longer than the time contemplated by the Argonaut Project. ONC also stops short of saying that element-centric interoperability will or even should replace document-centric exchange. But it talks about electronic sharing of summary care records – not documents – between hospitals, SNFs, and home health agencies.
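To make the document-centric versus element-centric distinction concrete, here is a minimal sketch of reading discrete elements from a FHIR Patient resource. The JSON structure follows FHIR’s published Patient resource; the server URL in the comment is hypothetical:

```python
import json

# A trimmed FHIR Patient resource, as might be returned by
# GET https://fhir.example.org/Patient/123 (hypothetical server).
patient_json = """
{
  "resourceType": "Patient",
  "id": "123",
  "name": [{"family": "Smith", "given": ["Jane"]}],
  "birthDate": "1970-04-02"
}
"""

patient = json.loads(patient_json)

# Element-centric access: individual data elements, not an opaque document.
family = patient["name"][0]["family"]
given = patient["name"][0]["given"][0]
print(f"{given} {family}, born {patient['birthDate']}")  # Jane Smith, born 1970-04-02
```

The appeal to the Argonaut Project crowd is precisely this: a consumer of the data can pull one field without parsing, rendering, and reconciling an entire summary-of-care document.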
The roadmap reinforces and effectively doubles down on the centrality of standards in any plan to foster better interoperability. Borrowing liberally if not literally from the Common MU Data Set, ONC wants to know how it can help make data more intelligible inside and between providers. As a companion to the roadmap, ONC also released an “Advisory” on the most common standards in order to solicit opinions on where the industry’s best-practice focus should be. ONC believes that standards compliance and the elimination of non-conforming data representations will pay dividends.
The counterpoint to this emphasis on standards is the view held by many smaller, often start-up vendors that see standards as a means to preserve the status quo, serving more of a gatekeeping function than an enabling one. Data networks in other industries, while admittedly simpler, do not rely on prescriptive application-to-application data representation standards. Healthcare is the only industry with such an ornate implementation of Layer 7 of the OSI stack. Smaller vendors would rather see simpler standards, published APIs, or more of a focus on the results of exchange than on how the result is to be achieved.
The reality is that major HIT vendors and major providers grumble about prescriptive requirements but by and large remain deeply committed to standards and compliance. We think that ONC could have at least offered up the prospect of achieving interoperability goals without specifying the mechanism down to specific data elements. Unfortunately, ONC appears to be continuing upon its questionable path of highly prescriptive guidelines – guidelines that ultimately hinder innovation rather than create opportunities for innovation to flourish.
Read the Priority Use Cases Right Now
Part III contains 56 use cases, obviously culled from users, that are a dog’s breakfast of highly specific and extremely general interoperability complaints. ONC is asking the industry to help it order and prioritize the vast range of technical and policy questions that they raise.
We recommend that if you read nothing else in the entire roadmap, you should read these use cases because they sound more like demands than actual use cases.
For example, Item #8 – certified EHR technology (CEHRT) should be required to provide standardized data export and import capabilities to enable providers to change software vendors. Every HIT vendor believes publishing database schemas is the last stop on the way to mob rule. In case vendors as a class were uncertain about their reputation among at least some providers, this “use case” provides unambiguous feedback.
A large number of the use cases are decidedly patient-centric despite the provider-centric orientation of the wider healthcare system and significant resistance to any kind of reorientation. A significant number of the use cases are related to payment and administrative uses even though the roadmap’s focus is on clinical data and clinical uses of the data. There are also a large number of use cases related to support for the public health system and clinical research. Both of these constituencies unfortunately take a back seat to immediate patient care and payment priorities.
The roadmap mentions, without analysis, the fundamental problem of FFS-based disincentives to data sharing. HHS has recently announced new goals for progress on the way to value-based reimbursement (VBR), but ONC has little leverage to do much more.
Another important issue that the roadmap does not and likely can’t address is the level of investment in IT by healthcare providers. While many yearn for an interoperable infrastructure comparable to what banking or retail enjoy, those industries spend far more, as a percentage of revenue, on IT than healthcare providers. Progress on EHR adoption was not a result of providers reallocating resources to technology adoption, but of federal incentives under the HITECH Act.
Therefore, can we really expect HCOs to increase IT budgets to support interoperability? Probably not. Moreover, ONC and, more broadly, HHS do not have the funding to support interoperability adoption on the scale of EHR adoption via the HITECH Act, absent congressional action. Most HCOs are cash-strapped, are struggling with a multitude of changes occurring in the marketplace, and frankly have a fairly poor record in the effective adoption, deployment, and use of IT in the context of care delivery. This is a knowledge-intensive industry that has done a pretty lousy job at effectively harnessing that knowledge via IT.
The only leverage ONC and HHS have to improve interoperability is payment incentives via CMS. Recently, HHS announced that it will accelerate the move to VBR. Following closely on that announcement was the formation of the Healthcare Transformation Task Force, an industry association that sees its task as helping the industry migrate to VBR. It is far more likely that organizations such as this in conjunction with payment reform will do far more to achieve interoperability than any prescriptive roadmap.
It may be high time for ONC to step back and let the industry tackle this one on its own, for only the industry has a true vested interest, via payment reform, in sharing PHI in the context of care delivery across a community in support of the Triple Aim and population health management.
Incentives, Regulations and Consequences
Good intentions do not always result, in the long term, in good policy. Such may be the case with the HITECH Act, which was passed as part of the huge stimulus bill ARRA in 2009. This bill launched the massive adoption of EHRs by physicians and hospitals across the country; hospital adoption of a basic EHR now stands at well over 55%, up from a paltry less than 10% pre-ARRA. Similar trends in EHR adoption can also be found among ambulatory practices.
There is no question that the HITECH Act has achieved one of its primary objectives: fostering the adoption, via incentives, of certified EHR technology (CEHRT). This is truly a good thing, for only by digitizing health data can we then move on to further public health policy goals of beginning to understand what actually contributes to health and well-being (comparative effectiveness), and also move towards a model of personalized medicine and true patient engagement.
But at what point does the government’s role in fostering adoption of CEHRT end and market forces begin?
A Little History:
To foster adoption of CEHRT but also ensure that taxpayers (after all, we’re the ones footing the bill for these incentives) get value from said adoption, ONC pulled together a number of workgroups to define “meaningful use” requirements that physicians and hospitals would need to demonstrate to get their incentive payments. This was broken up into three “Stages,” with each stage building upon the previous.
The first stage of “meaningful use” requirements was pretty simple, as the plan was to just get the medical establishment to begin adopting CEHRT and familiarize them with usage of this tech. The incentive payments were also front-loaded (providers receive more for meeting stage one than for subsequent stages), so lo and behold, we saw strong adoption and attestation for stage one. Hip, hip hooray were the cheers heard at the Hubert Humphrey building in DC.
Where We Are Today:
But that low barrier to stage one adoption created a false market for EHR technology. There is now a plethora of EHR vendors, especially on the ambulatory side, that frankly should never have made it this far.
Meaningful use stage two requirements for certification are a significant hurdle for many of these EHR vendors, who simply do not have the resources or the technical chops to meet them. Sadly, a lot of ambulatory practices will suffer as a result. This in large part led to the proposed rule released this week by CMS to allow providers to postpone attesting with stage 2 CEHRT this year and instead attest with 2011 CEHRT. It is CMS’s hope that this delay will give EHR vendors the time to get their act together and certify for stage two, as well as give providers sufficient time to adopt these updated systems and attest.
Time to Step Out of the Way and Let the Market Take Over:
But as often happens with government initiatives, initial policy to foster adoption of a given technology can have unintended consequences no matter how well meaning the original intent may be.
During my stint at MIT my research focus was diffusion of technology into regulated markets. At the time I was looking at the environmental market and what both the Clean Air Act and Clean Water Act did to foster technology adoption. What my research found was that the policies instituted by these Acts led to rapid adoption of technology to meet specific guidelines and subsequently contributed to a cleaner environment. However, these policies also led to a complete stalling of innovation as the policies were too prescriptive. Innovation did not return to these markets until policies had changed allowing market forces to dictate compliance. In the case of the Clean Air Act, it was the creation of a market for trading of COx, SOx and NOx emissions.
We are beginning to see something similar play out in the HIT market. Stage one got the adoption ball rolling for EHRs. Again, this is a great victory for federal policy and public health. But we are now at a point where federal policy needs to take a back seat to market forces. The market itself will separate the winners from the losers.
The move to value-based reimbursement (VBR) will force healthcare organizations of all sizes to adopt some aspect of population health management. Interoperability, the big sore point today, is not so much a technology issue as it is a market issue – and population health management is impossible without interoperability. While I know that the new ONC director, Karen DeSalvo, is well-meaning in her intentions, interoperability is something that the market needs to sort out, not ONC. My fear is that by letting ONC/CMS define interoperability, we will be left with highly prescriptive definitions and not the innovative models this market desperately needs.
I applaud the hard work and efforts of all the public servants of HHS and volunteers who have worked tirelessly to get us to where we are today. However, it is now time for them to refocus their efforts elsewhere. Maybe a good place to start is assisting all those ambulatory practices that adopted a CEHRT under stage one in transitioning to a more viable and stable EHR vendor for the long term. Then again, maybe this is just an issue of caveat emptor.
Three Big Questions for Stage 3 & Patient Engagement
For many, the delay of Stage 3 of the Meaningful Use program evoked a collective sigh of relief, providing a much-needed extra year to focus on the challenging requirements for patient engagement and interoperability. As distant as 2017 may seem, however, preparation for Stage 3 is already underway in Washington; the vendor community and providers will soon be scrambling to follow suit.
Barring further delays, the timeline is as follows: This fall CMS will release the notice of proposed rulemaking (NPRM) for Stage 3 and the corresponding NPRM for the Standards and Certification Criteria. The former is the programmatic framework for what to expect – measures, percentages, reporting requirements, etc., while the latter is the technical product guidelines for software vendors to follow in order to receive ONC certification as a Stage 3 compliant solution that will enable their customers, if properly implemented and used, to collect those sought-after incentive dollars. The final rule is expected to drop sometime in Q1-Q2 of 2015 – just one year away.
But that doesn’t mean there’s a year to put off thinking about it. In a few short weeks, the Health IT Policy Committee (HITPC) is set to deliver an official recommendation on the topic of Stage 3’s patient engagement requirements to the ONC. From all indications, it appears this super-group of wonks will press for inclusion of patient-generated health data (PGHD – yet another #ONCronym for your twitter streams) into electronic health record systems. The technical experts have defined PGHD as follows:
“health-related data—including health history, symptoms, biometric data, treatment history, lifestyle choices, and other information—created, recorded, gathered, or inferred by or from patients or their designees (i.e., care partners or those who assist them) to help address a health concern.”
At first glance, this is a no-brainer, as we’ve been hearing the clarion calls for such inputs for the better part of the last decade. 60 percent of US adults claim to track their weight, diet, or exercise routine, according to the Pew Research Center’s data. Evidence for the positive impact of this data on quality, satisfaction, and in some cases cost is thin but growing.
But as we are learning through the first two stages of this program as well as the early headaches of ACA rollout, reams of sophisticated studies floated down from the ivory tower do not effective policies make. Despite the need for PGHD, when it is wonkified, ONCified, and held to the temple of the nation’s delivery system, there may be a small disaster in waiting. Below are three questions Chilmark is keenly tracking throughout the remainder of 2014:
What Constitutes PGHD?
The language used thus far raises much speculation about what exactly this inclusion will mean when it hits the front lines. The definition provides only a general description, leaving a lot of possibility for interpretation and application down the road. For many, PGHD evokes the notion of datastreams from the vast array of health and wellness devices such as Fitbits and Jawbones, Bluetooth medical devices, and of course, tracking apps. Yet the definition above makes PGHD seem to carry more of a health risk assessment (HRA)-like utility, where patients fill out a survey and have it sent to their doctors in advance. Yet another angle is the notion of patient-reported outcomes: clinically oriented inputs from patients with regard to their physical and psychosocial health status. Outfits like ATA, HIMSS, and others are lobbying for full inclusion of patient-monitoring and home-health data.
Each of these use cases brings with it a unique set of programmatic and technical components. A popular example as of late is biometric data: If a panel of diabetic patients is given Bluetooth glucometers that feed into their respective EHRs, then what? Will someone monitor each of them? Or are HCOs expected to feed those data into an algorithm that alerts on and ultimately predicts any aberrance? This has been referred to as providing doctors with ‘insight’ rather than raw data. That sounds snazzy, but can we realistically mandate the creation of insight?
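To see how thin the line between “raw data” and “insight” really is, here is a minimal sketch of the kind of rule that would have to sit between glucometer readings and an actionable alert. The thresholds and structure are illustrative placeholders, not clinical guidance:

```python
# Illustrative only: thresholds are placeholders, not clinical guidance.
HYPO_MG_DL = 70    # flag readings below this value
HYPER_MG_DL = 250  # flag readings above this value

def flag_readings(readings_mg_dl):
    """Turn raw glucometer values into simple alerts for clinician review."""
    alerts = []
    for value in readings_mg_dl:
        if value < HYPO_MG_DL:
            alerts.append((value, "possible hypoglycemia"))
        elif value > HYPER_MG_DL:
            alerts.append((value, "possible hyperglycemia"))
    return alerts

readings = [95, 62, 140, 310]
for value, label in flag_readings(readings):
    print(f"{value} mg/dL: {label}")
```

Even this trivial rule begs the staffing question: every alert it emits still lands in someone’s queue, and tuning those thresholds per patient is where the real clinical and liability work begins.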
Collecting data such as patient allergies or side effects appears a simpler use case on paper. Yet HITPC appears to be using everyone’s favorite A+ students – IDNs like Geisinger, Kaiser Permanente, and Group Health Cooperative, among others – as the basis for its recommendation. As one example, the report lauds GHC’s eHRA model, which is based on a shared EHR and shared clinical staff for data review. As nicely as that may work, Chilmark is skeptical that it’s reproducible in an average clinical setting. Generally, the innovators in the digital engagement space have been the insurers, not the providers. We understand the need to look at innovators in order to prescribe a path for the rest of the country, but in talking to regular folks at urban hospitals, community clinics, and mid-sized IPAs – it’s more likely that fluid data is a byproduct of integrated systems, not the other way around.
How Will the Market Respond?
Despite its unpopularity in the C-suite, meaningful use has forced EHR vendors to pull their heads out of the sand and advance their product features. In addition to giving providers a break, part of the reason behind the Stage 3 delay was for vendors’ benefit: “[to provide] ample time for developers to create and distribute certified EHR technology…and incorporate lessons learned about usability and customization.” The Standards and Certification Criteria 2017 edition will play a big role in the next lurch forward, and one can be sure that those new mandated features will be all the rage at HIMSS 2015.
Yet at the broadest level, the evolution of EHRs (billing >> administration >> clinical) appears to be stalling. In exploring the patient engagement market and the to-date limited functionality of tethered patient portals, despite Stage 2’s requirements, one thing has become clear: EHR vendors will not simply add new features for the sake of their customers (forget about patients). With new PGHD functionality emerging, we expect new companies to step up to the plate and seek modular ONC-ATCB certification.
An example already underway is third-party data integration. Over the last few years, device manufacturers, startups, and third parties started seeing the value in injecting their data into EHRs. The emergence of middleware companies that provide integration as a service, such as NantHealth, Corepoint, and Validic, will continue as PGHD requirements develop over the coming months. Similar companies will start (and already are) filling the void for HRA functionality, portal requirements, patient communication, and so on. We expect that this will only exacerbate the headache faced by CIOs and CMIOs with a long list of purchasing options. Startups take note: It should also set off a shopping spree by EHR companies and other enterprise vendors looking to buy rather than build. Allscripts’ acquisition of Jardogs last year is one such example.
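Much of what these middleware vendors sell is data mapping. A minimal sketch of what that translation looks like, mapping a raw device reading onto a FHIR-style Observation (the device payload and its field names are hypothetical; the LOINC code shown is the real code for body weight):

```python
# Hypothetical device payload; field names are illustrative assumptions.
device_reading = {
    "device_id": "scale-001",
    "weight_kg": 81.6,
    "ts": "2014-05-01T08:30:00Z",
}

def to_fhir_observation(reading):
    """Map a raw device reading onto a FHIR-style Observation structure."""
    return {
        "resourceType": "Observation",
        "status": "final",
        # LOINC 29463-7 is the standard code for body weight.
        "code": {"coding": [{"system": "http://loinc.org",
                             "code": "29463-7",
                             "display": "Body weight"}]},
        "effectiveDateTime": reading["ts"],
        "valueQuantity": {"value": reading["weight_kg"], "unit": "kg"},
    }

obs = to_fhir_observation(device_reading)
print(obs["valueQuantity"]["value"], obs["valueQuantity"]["unit"])  # 81.6 kg
```

The mapping itself is mundane; the value these vendors capture is in maintaining hundreds of such mappings, plus the connectivity, consent, and delivery plumbing around them.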
Will Providers be Ready?
In a word, no. The inclusion of PGHD brings with it an avalanche of procedural and programmatic preparation: data review and quality assurance, governance models and new workflows, the prickly issue of data ownership, staff time and training, liability concerns, HIPAA extension of coverage, ever-increasing insurer coordination, clinician accountability, and of course, patient consent, onboarding, and marketing. With the last one, keep in mind that we now live in the post-Snowden era…
Of course, without details of the required measures, further hand-wringing is unwarranted at this point. But suffice to say there’s a small storm a-comin’. As the definitions, rules, and standards of patient-generated health data emerge, we look forward to what promises to be a rich commentary and response to the NPRM amidst the broader discussion in the health IT community throughout 2014.