2019 Predictions: M&A, Big Tech, and the Fate of ACOs
The Meaningful Use gravy train finally came to an end in 2018. As the strongest EHR vendors struggled to define new revenue streams, weaker ones faded from view through acquisition or leveraged buyout. Meanwhile, funding for ‘digital health’ start-ups continued to increase, though it likely hit its high-water mark in 2018. And lest we forget, Amazon, Apple, and Google continue their forays into the healthcare sector, as the market is simply too big to ignore.
So what’s in store for 2019?
We brought together our analysts’ brain trust and came up with the following baker’s dozen of 2019 predictions. Over the near decade of making these annual predictions, our batting average has consistently been well above .500, so don’t ever say we didn’t give you advance warning on the following:
Revenue cycle management M&A activity will continue to pick up with the most notable acquisition by Optum as it doubles down on its Optum 360 managed revenue cycle business and acquires Conifer Health Solutions from Tenet.
Despite the hype and media attention around alternative primary care clinics (e.g., Oak Street Health, ChenMed, One Medical), the actual number of physical locations serving patients will remain paltry at less than ten percent of the number of retail health clinic locations.
Walgreens will likely make the first move to acquire Humana in 2019, but Walmart will outbid Walgreens to win Humana over.
The number of FDA approvals for algorithms in 2018 was impressive and shows no signs of abating. Additionally, 2020 will see a further tripling of regulatory approvals for AI.
Consumers’ use of telehealth will continue to see rapid growth, with rising competition leading to significant consolidation among the plethora of vendors. By year-end, a major non-healthcare-specific consumer brand will join the mix, and the market will be down to five direct-to-consumer (DTC) nationwide brands.
By the end of 2019, every major healthcare analytics vendor will provide a cloud-hosted offering with optional data science and report development services.
Cloud offerings have become far more robust, concurrent with HCOs’ struggles to recruit IT talent and control costs. Amazon’s AWS and Microsoft’s Azure will be clear winners while Google’s own cloud infrastructure services will remain a distant third in 2019.
Laws and regulations to-date have not compelled providers to freely share data with patients. ONC’s information blocking rule, which will be released before the end of 2018, will make it easier to transfer data to other organizations but will do little to open the data floodgates for patients, clinicians, and developers.
Despite loud protests, the vast majority of provider-led MSSP ACOs will take on downside risk as CMS shows flexibility in waivers. However, hospital-led ACOs, which continue to struggle with standing up profitable MSSP ACOs, will exit the program in 2019.
Continued changes in post-acute care reimbursement, especially from CMS, combined with the migration to home-based services, put further economic strain on these facilities. Nearly twenty percent of post-acute care facilities will shutter or merge in 2019.
The warning signs of the past couple of months show that the stock market has become skittish. This will extend well into 2019 (if not lead to a mild recession). It will hardly be an ideal time for an IPO, and those planned by Change Healthcare, Health Catalyst, and others will wait another year.
Elon Musk will have a nervous breakdown leading him to reinvent the healthcare system from his bed during his two-week recovery at Cedars-Sinai.
Promoting Interoperability: MU Fades to Black
Seeking to liberate the industry from its self-created morass of siloed data and duplicative quality reporting programs, the Department of Health and Human Services (HHS) issued 1,883 pages of proposed changes to Medicare and Medicaid. It renamed the Medicare and Medicaid Electronic Health Record (EHR) Incentive Programs (known by all as Meaningful Use) to Promoting Interoperability Programs (PI).
As widely reported, it would eliminate some measures that acute care hospitals must report and remove redundant measures across the five hospital quality and value-based purchasing programs. It would also reduce the reporting period to 90 days. HHS will be taking comments until June 25, 2018.
HHS believes that requiring hospitals to use 2015 Edition CEHRT in 2019 makes sense because such a large proportion of the hospitals are “ready to use” the 2015 Edition. Ready to use is not the same as using. 2015 Edition EHRs may not be as widely deployed as HHS indicates. The following 10-month-old snapshot from ONC shows hospitals have not aggressively moved to adopt 2015 Edition CEHRT.
Current adoption levels by HCOs are undoubtedly better, and many vendors have 2015 Edition technology ready to go, but hospitals can only change so fast. The rush to get hospitals on the most current edition has to do with the most relevant difference between the 2014 and 2015 Editions – the API requirement. APIs will be the technical centerpiece of better, more modern interoperability, but adoption levels are still low. APIs, by themselves, offer the promise of better data liquidity. For this promise to become a reality, healthcare stakeholders need more than just a solid set of APIs.
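FHIR is the leading candidate technology for meeting the 2015 Edition API requirement. As a rough illustration of what API-served, discrete data buys an app developer, consider pulling demographics out of a minimal, hand-written FHIR Patient resource (the resource below is illustrative, not taken from any real system):

```python
import json

# A minimal FHIR R4 Patient resource -- hand-written for illustration.
patient_json = """
{
  "resourceType": "Patient",
  "id": "example",
  "name": [{"family": "Chalmers", "given": ["Peter"]}],
  "birthDate": "1974-12-25"
}
"""

patient = json.loads(patient_json)

# The promise of the API requirement: apps read discrete fields directly
# from structured resources rather than scraping documents or portals.
family = patient["name"][0]["family"]
given = " ".join(patient["name"][0]["given"])
print(f"{given} {family}, born {patient['birthDate']}")  # Peter Chalmers, born 1974-12-25
```

The data liquidity question is everything that surrounds this snippet: whether a server exposes the endpoint, who is authorized to call it, and whether the fields are actually populated.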
HHS is also proposing that hospitals post their standard charges and update that list annually.
This is a nice thought, but it will take some heavy lifting to pull this off. For starters, HHS doesn’t even have a definition of “standard charge” and is seeking stakeholder input before the final rule is published. HHS also must determine how to display standard charges to patients, how much detail about out-of-pocket costs to include (for patients covered by public and private insurance), and what noncompliance penalties are appropriate.
Above all, there’s the thorny issue of establishing what a standard charge is in the first place. Charges vary by payer. Can a hospital truly state, without a doubt, the cost of an MRI or a colonoscopy? Most cannot – and technology alone will hardly solve this problem.
The existence of APIs will stand in for the old view/download/transmit (VDT) requirement. Regarded as one of meaningful use’s most troublesome and fruitless requirements, the rule has been shed by HHS because of “ongoing concern with measures which require patient action for successful attestation.”
VDT is one of several MU Stage 3 requirements pertaining to patient engagement – along with providing secure messaging or patient-specific educational resources – that HHS has proposed dropping, under the pretense that it is “burdensome” to healthcare providers. While hospitals have struggled to get many patients to participate, the VDT requirement set the bar at one patient out of an entire population. What’s more, dropping the requirements fails to take into account how burdensome it is for patients to try to access their data, communicate with their physicians, and learn about their conditions and treatment options. It is also contrary to CMS Administrator Seema Verma’s remarks, first at HIMSS18 and again this week, indicating that the agency seeks to “put patients first.”
HHS says that third-party developed apps that use APIs will deliver “more flexibility and smoother workflow from various systems than what is often found in many current patient portals.” Whether such apps deliver “smoother workflow” is not a foregone conclusion.
HHS proposes “a new scoring methodology that reduces burden and provides greater flexibility to hospitals while focusing on increased interoperability and patient access.” The proposed scoring methodology uses a 100-point system (explained over 24 pages) in which attaining a score of at least 50 means there will be no Medicare (or Medicaid) payment reduction.
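To make the 50-point threshold concrete, here is a hypothetical illustration of how a 100-point, 50-to-pass scheme works. The measure names, point allocations, and performance rates below are invented for illustration and are not the actual proposed values:

```python
# Hypothetical PI-style scoring: each measure earns
# (points available x performance rate); values are invented.
MEASURES = {
    "e_prescribing":                {"points": 10, "rate": 0.80},
    "health_info_exchange":         {"points": 40, "rate": 0.55},
    "provider_to_patient_exchange": {"points": 40, "rate": 0.60},
    "public_health_reporting":      {"points": 10, "rate": 1.00},
}

def pi_score(measures: dict) -> float:
    """Sum weighted measure scores on a 100-point scale."""
    return sum(m["points"] * m["rate"] for m in measures.values())

score = pi_score(MEASURES)
payment_reduction = score < 50  # a score of at least 50 avoids the reduction
print(f"score={score:.1f}, payment_reduction={payment_reduction}")
```

Even with middling performance on the heavily weighted exchange measures, this hypothetical hospital clears 50 and avoids the payment reduction.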
HHS is also mulling whether to abandon these measures altogether in favor of scores calculated at the objective level.
The biggest regulatory effort in recent months related to interoperability, other than this proposal, has been ONC’s proposed Trusted Exchange Framework and Common Agreement (TEFCA), required under the 21st Century Cures Act. TEFCA, well along in the planning stages, is a new set of regulations from ONC whose goal is to catalyze better data availability using APIs. HHS in this regulation wants public comment on whether participation in a TEFCA-compliant network should replace the process measures in the Health Information Exchange objective. Stated another way: Should TEFCA compliance replace 80 percent of the score for PI (75 percent in 2020)?
TEFCA is widely expected to provide a safe harbor from data blocking liability although ONC has been mum on this point. TEFCA then could do double duty: Eliminate the need to meet or report on health information exchange metrics and provide a shield from data blocking enforcement.
But there are, as yet, unanswered questions about TEFCA:
HHS is also considering doing away with the Public Health and Clinical Data Exchange objective. It floated the idea that a provider that supports FHIR APIs for population-level data would not need to report on any of the measures under this objective. This would replace 90 percent of the score for PI (85 percent in 2020) when combined with the TEFCA knockout.
The specific API mentioned, called Flat FHIR and still in development, will probably contribute to part of the complex process of public health and registry reporting. This activity currently requires highly skilled data hunter-gatherers, usually with clinical credentials. In many organizations, these hunter-gatherers manually sift and collate multiple data sources to meet the varied requirements of the recipients of different registries. Flat FHIR, assuming it were production-ready, would certainly help, but it is unlikely that it could provide all, or even most, of the information needed for the range of public health reporting programs.
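The Flat FHIR pattern is an asynchronous, population-level export: a client kicks off an `$export` on a group of patients and polls for bulk files. A minimal sketch of composing that kick-off request, assuming a hypothetical server base URL and group ID (and bearing in mind the spec was still in development at this writing):

```python
# Sketch of a "Flat FHIR" (FHIR Bulk Data) export kick-off request.
# The base URL and group ID are hypothetical; real use depends on
# server support for the still-maturing specification.
BASE_URL = "https://fhir.example-hospital.org/r4"

def bulk_export_kickoff(group_id: str, since: str) -> dict:
    """Compose the async $export kick-off for one patient group.

    The server is expected to reply 202 Accepted with a status URL the
    client polls until the bulk files are ready.
    """
    return {
        "method": "GET",
        "url": f"{BASE_URL}/Group/{group_id}/$export?_since={since}",
        "headers": {
            "Accept": "application/fhir+json",
            "Prefer": "respond-async",  # signals the asynchronous pattern
        },
    }

req = bulk_export_kickoff("immunization-registry", "2018-01-01")
print(req["url"])
```

Replacing the manual hunter-gatherer work would still require the exported resources to carry every field a given registry demands, which is exactly the gap noted above.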
HHS acknowledges that providers are less than thrilled with aspects of the Quality Payment Program (QPP). HHS wants to know how PI for hospitals can better “align” with the requirements for eligible clinicians under MIPS and Advanced APMs. In particular, it wants ideas about how to reduce the reporting burden for hospital-based MIPS-eligible clinicians. It is undoubtedly looking for market-acceptable ideas to reduce the reporting burden where it is arguably more deeply felt – among non-hospital-based MIPS-eligible clinicians. While reducing or eliminating the reporting burden would help such providers, the big unanswered question, as it is with hospitals, is the burden of getting to 2015 Edition CEHRT.
HHS also asks the industry how it could use existing CMS health and safety regulations and standards to further advance electronic exchange of information. It is ready to change Conditions of Participation (CoPs), Conditions for Coverage (CfCs), and Requirements for Participation (RfPs) for Long Term Care Facilities regulations to this effect. It wants to know whether requiring electronic exchange of medically necessary information in these regulations would move the interoperability needle.
HHS believes that APIs will solve all of the problems that patients and healthcare stakeholders have with data access. HHS also seems prepared to declare that TEFCA compliance and 2015 Edition CEHRT guarantee that those APIs are in place. It roundly ignores the mesh of incentives that make stakeholders unwilling to share data and patients unable to access data. The industry has cried out for less process reporting and better insight into outcomes for years. This will accomplish the former but set the industry back with respect to the latter if interoperability is declared solved based on technology alone.
Blockchain in Healthcare: Enthusiasm is growing, but still a long way to go to realize impact
The speculative craze around Bitcoin and the Initial Coin Offering (ICO) market for startups in the digital currency and blockchain space has pushed interest in blockchain beyond the crypto-community and into the mainstream. ICO speculation drove investment in these vehicles to nearly $7 billion in 2017, dwarfing venture capital for startups across all sectors, including healthcare. Among the hundreds of startups in the blockchain ICO count, over 70 deal with healthcare, according to industry observer Vince Kuraitis. But speculative capital rarely translates into long-term sustainability or the disruptive business models that startup founders espouse. Limitations in scalability, transaction speed, and energy consumption have been some of the dominant concerns. This is particularly true in healthcare, and we have reached an important point in the evolution of the blockchain space: it is worth pausing to take stock of how blockchain applications are evolving and which specific pain points in health IT current blockchain infrastructures can realistically address.
Chilmark Research is releasing our Market Scan Report (MSR) “Blockchain in Healthcare” with this goal in mind. We wanted to explore the first generation of use cases and startups in the blockchain space and provide a strong foundation for understanding the potential applications in areas that speak to the core capabilities of blockchain as they relate to payers and providers: identity management, data sharing/exchange, provider directories, patient indices, claims adjudication, supply chains, and revenue cycle management. We can think of blockchain as having a role in addressing pain points in the healthcare system where we see some of the following challenges:
Contexts where there is dependency between transactions and an asset, such as data, passes from one party to another are important areas where blockchain can be valuable. Furthermore, when verification of the integrity or provenance of the data is valuable, the immutability and time-stamping features that blockchain provides are very useful.
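The immutability and time-stamping properties come from hashing each entry together with its predecessor's hash, so altering any earlier record breaks every later link. A minimal sketch of that idea (not a real distributed ledger: no consensus protocol, no network, invented events):

```python
import hashlib
import json

def make_block(payload: dict, prev_hash: str, timestamp: float) -> dict:
    """Create a ledger entry whose hash covers its payload, timestamp,
    and the hash of the previous entry."""
    body = {"payload": payload, "prev_hash": prev_hash, "timestamp": timestamp}
    body["hash"] = hashlib.sha256(
        json.dumps(body, sort_keys=True).encode()
    ).hexdigest()
    return body

def chain_is_valid(chain: list) -> bool:
    """Verify every entry links to its predecessor's recorded hash."""
    for prev, block in zip(chain, chain[1:]):
        if block["prev_hash"] != prev["hash"]:
            return False  # a tampered ancestor breaks every later link
    return True

genesis = make_block({"event": "record created"}, "0" * 64, 1718000000.0)
transfer = make_block({"event": "record shared with payer"},
                      genesis["hash"], 1718000060.0)
chain = [genesis, transfer]
print(chain_is_valid(chain))  # True

# Tampering with an earlier entry is immediately detectable:
genesis["hash"] = "deadbeef"
print(chain_is_valid(chain))  # False
```

This is the narrow technical core; the hard parts in healthcare are the governance and incentive questions around who runs the nodes and who may write to the ledger.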
Our research into the applications currently in development and most ready for prime time in the next year covers companies working in the following areas of the provider-payer nexus:
We provide brief vendor profiles of consulting firms, large tech providers, and startups working in the blockchain space, along with an analysis of where we see this market going in the coming years. These include some of the larger consulting and tech firms offering enterprise Blockchain-as-a-Service, including IBM, Accenture, Deloitte, and T-Systems. The startups profiled include PokitDok, Simply Vital Health, Solve.Care, MedRec, Change Healthcare, and Factom. There are also alternatives to blockchain, and a growing number of companies have chosen to develop “blockchain friendly” applications until the blockchain ecosystem reaches a higher level of maturity. Google’s DeepMind Health is using an alternative distributed ledger technology with patient records in the UK’s National Health Service after creating an uproar over perceived misuse of patient data without approvals or prior consent; blockchain offers an auditable way to address this controversy. A supply chain/cybersecurity offering by Cloudface provides an analytics solution for hospital supply chains, with plans to create blockchain applications in the near future. There is also growing interest in IOTA’s Tangle for the Internet of Things (IoT).
One interesting example that emerged at the end of our writing this report was the recently announced Optum-Quest-Humana-MultiPlan blockchain initiative to improve provider directories. This is part of an effort to streamline back-office operations for payers. The initiative was developed to address the problem that arises when claims from providers enter the system and there is a mismatch between the records payers have on providers and the provider identity. Often the payer directories are not up to date, and the resulting failure to reconcile claims submitted by providers results in delays or non-payment. The initiative will launch a pilot during the summer of 2018. What is important about this initiative is the involvement of multiple stakeholders, making it a more salient use case of blockchain for pain points resulting from a lack of coordination and effective sharing of data across multiple companies. We view blockchain as a tool for industry transformation rather than disruption of companies; this is a prime example of how to begin thinking about utilizing blockchain in a transformative manner. Over the next decade, we expect to see fewer discussions of blockchain in isolation; it will be a component alongside the cloud, AI, and other technologies used to automate administrative functions and enable more efficient sharing of data.
The example above also leads us to how Chilmark Research is beginning to think about the future of blockchain in healthcare. Blockchain is an inherently complex new technology that necessarily involves coordination across networks of stakeholders. This means that blockchain ‘disruption’ of healthcare that solves the interoperability challenge or other major pain points with a single technological quick fix is nowhere on the horizon, but we do believe the full transformative potential of blockchain will play out over the coming decade. 2018 will see a growing number of pilots and experiments, as the hype of 2016-17 spurred considerable interest.
Even government has gotten involved, with a Blockchain Caucus in Congress. The ONC’s white paper challenge and recent Trusted Exchange Framework bode well for blockchain’s future as well. Blockchain proponents would be well served in the long run by moving beyond a strictly techno-centric approach and developing more robust thinking on managing consortia, governance mechanisms, and the broader cryptoeconomics of blockchain in the context of the health economy. Thinking lags on these fronts, and the froth around easy ICO money in 2017 has not helped develop the critical thinking needed for blockchain’s long-term success in the health IT ecosystem.
For those interested in learning more, Chilmark Research will be attending the Second Annual Healthcare Blockchain Summit on June 11-12 in Boston, where many of the vendors and use cases we analyze in our report will be represented.
TEFCA Response Drives ONC to Reconsider
The public response to ONC’s proposed Trusted Exchange Framework and Common Agreement (TEFCA) and U.S. Core Data for Interoperability (USCDI) underscores the pent-up demand for better healthcare interoperability. Required by the 21st Century Cures Act, TEFCA and USCDI could impact the future of healthcare data portability in the same way that the EHR Incentive Program shaped EHR adoption. The goal is to catalyze better data availability needed to modernize provider and payer IT application portfolios. Taken at face value, these regulations could profoundly change the way that healthcare stakeholders provide and use each other’s data.
This isn’t ONC’s first hands-on attempt to crack the interoperability nut. Years ago, it awarded a cooperative agreement to DirectTrust to oversee the rollout of secure messaging across the industry. In return, it got a slow but steady adoption curve that has moved the needle but fallen short of providing universal data availability.
This time around, ONC has higher hopes for a more accelerated and effective result. While the former effort relied on established technology, this proposal leans heavily on technology that does not exist – in or out of healthcare – and an architectural approach with a checkered history. The DirectTrust effort combined carrot and stick incentives, while this one offers only the potential for protection from the stick.
TEFCA proposes to join clinicians, hospitals, payers, and others in a national network of networks. It would designate a set of organizations called “Qualified HINs” (QHINs) to facilitate a standardized “on-ramp” for nationwide healthcare connectivity. A separate organization called the Recognized Coordinating Entity (RCE), to be funded by ONC through a cooperative agreement, would manage and oversee this network of QHINs. Once a user or organization is on this national on-ramp, it will be able to exchange data for three different use cases across six permissible purposes.
USCDI defines the data types eligible – or required, depending on how you read it – for exchange. It builds on the Common Clinical Data Set (CCDS) and will expand the roster of exchangeable data types based on maturity and market acceptance. In the short term, it proposes to add structured and unstructured notes and data provenance to the list of CCDS data items.
The response to this proposed regulation was not only voluminous by the standards of ONC rulemaking, but also filled with questions, concerns, requests for clarification, and conflicting interpretations. The sheer number of distinct issues raised by commenters is evidence of the complexity of the topic, the difficulty of ONC’s task, and the problems with the proposal. Healthcare stakeholders want and need to leverage existing investments, but ONC’s new plan looks more like rip-and-replace. ONC’s insistence that TEFCA is voluntary and non-regulatory is cold comfort to any organization trying to forecast how much it will cost to comply.
Before delving into the specific concerns, it’s important to note that the general approach of TEFCA and USCDI moves the conversation in the right direction. It could turn out to be less costly for providers to sign up for one network than it is to enroll in the multiplying number of purpose-built clinical and financial networks. TEFCA also emphasizes query-based exchange of discrete data elements, which represents a desirable interoperability outcome.
ONC has to be pondering its options in the wake of such a stiff and unified response. The 21st Century Cures Act specifically requires it to develop TEFCA and establish rules for data blocking. The Senate HELP committee has already sought testimony on ONC’s progress on both fronts. Besides Congress, the highest levels of the current administration want action to solve the problem of data liquidity. ONC has to act, and soon.
Prior to the public response to TEFCA, ONC was on course to award the RCE cooperative agreement to the Sequoia Project, on the strength of its stewardship of the eHealth Exchange – whose implementation and maintenance ONC transitioned to it – and the governance muscle Sequoia showed, as evidenced by Carequality’s adoption curve. This is relatively unsurprising, since few other existing organizations have the independence, HIT expertise, scale, and track record that Sequoia has assembled.
ONC also seemed destined to declare that TEFCA compliance would provide a safe harbor against data blocking liability. ONC chose not to define data blocking and TEFCA in a single set of regulations. Instead, it asked the industry to accept new “rules of the road” for information exchange without knowing what happens when an organization can’t comply with those rules. ONC did not help its case by forcing the industry to conclude that TEFCA compliance is the only way to qualify for the data blocking safe harbor. Every conversation we’ve had about TEFCA since January is primarily about this linkage. Such a requirement is especially controversial among the smaller organizations that make up the bulk of healthcare, to the extent they are even aware that such a rule could happen. That ONC thought it could deal with data blocking in two regulatory steps is hard to understand.
Our prediction is that ONC will seek another round of public comments and revise TEFCA based on what it hears. It will then release a final version late this year that preserves the overall structure of TEFCA and USCDI from the current proposal. ONC will deal with objections by adding exemptions and exceptions that will phase in some of its more stringent provisions. Most of the exemptions and exceptions will concern TEFCA’s provisions for the data blocking safe harbor and population-level query. Effectively, it will delay the operation of these provisions to some later date. It will probably go ahead and award the cooperative agreement for RCE to the Sequoia Project once TEFCA is in final form late this year.
With all eyes on TEFCA, ONC has been mum about the fate of the “EHR reporting program,” also required by the Cures Act. This program is intended to help hospitals, clinicians, and other users of health information technology better understand how these products support interoperability and usability. ONC originally claimed it could not act on this requirement, pleading lack of budget. Then Congress rejected HHS’ request for a $22 million reduction in ONC’s budget, presumably enabling ONC to develop the program.