Revisiting Our 2018 Predictions
As is our custom here, we like to look back on our predictions for the closing year and see just how well we did. Some years we do amazingly well; others we over-reach and miss on quite a few. For 2018, we got seven of our 13 predictions spot-on, two were mixed results, and four failed to materialize. If we were an MLB batter, a .538 average would earn us the MVP award. But we are not, and have to accept that some years our prediction average may hover just above the midpoint, as it did this year.
Stay tuned, 2019 predictions will be released in about one week and it is our hope that they will inspire both rumination and conversation.
(Note: the plain text is the original prediction we made in late 2017, while the italic text is our review of how it fared in 2018.)
Major mergers and acquisitions that mark the end of 2017 (CVS-Aetna, Dignity Health-CHI and rumored Ascension-Providence) will spill over into 2018. Both Humana and Cigna will be in play, and one of them will be acquired or merged in 2018.
MISS – neither happened. However, Cigna did pick up PBM service Express Scripts, and rumors continue to swirl about a possible Humana-Walmart deal or, more recently, even a Walgreens-Humana deal.
Hot on the health heels of CVS’ acquisition of Aetna, growth in retail health reignites, albeit off a low overall footprint. By end of 2018, retail health clinic locations will exceed 3,000 and account for ~5% of all primary care encounters; up from 1,800 and ~2%, respectively, in 2015.
MISS – Modest growth in 2018 for retail health clinics, with an estimated ~2,100 locations by year's end. Telehealth, which is seeing rapid growth, and on-site clinics may be partially to blame.
In a bid to one-up Samsung's partnership with American Well, and to establish itself as the first tech giant to disrupt healthcare delivery, Apple will acquire a DTC telehealth vendor in 2018.
MISS – Apple continues to work on the periphery of care with a focus on driving adoption of its Health Records service in the near-term with a long-term goal of patient-directed and curated longitudinal health records.
Despite investments in population health management (PHM) solutions, payers still struggle with legacy back-end systems that hinder timely delivery of actionable claims data to provider organizations. The best intentions for value-based care will flounder and 60% of ACOs will struggle to break even. ACO formation will continue to grow, albeit more slowly, to mid-single digits in 2018 to just under 1,100 nationwide (up from 923 as of March 2017).
HIT – MSSP performance data showed only 34% earned shared savings in 2017 (up from 31% in 2016) and by year’s end it is estimated there will be ~1,025 ACOs in operation.
While some of the major EHR vendors have announced support for write access sometime this year and will definitely deliver this support to their most sophisticated customers, broad-based use of write APIs will happen after 2018. HCOs will be wary about willy-nilly changes to the patient record until they see how the pioneers fare.
HIT – FHIR-based read APIs are available from all of the major EHR vendors. Write APIs are still hard to find. To be fair, HCOs as a group are not loudly demanding write APIs.
True cloud-based deployments from name brand vendors such as AWS and Azure are in the minority today. But their price-performance advantages are undeniable to HIT vendors. Cerner will begin to incent its HealtheIntent customers to cloud host on AWS. Even Epic will dip its toes in the public cloud sometime in 2018, probably with some combination of Healthy Planet, Caboodle, and/or Kit.
HIT – adoption of cloud computing platforms is accelerating quickly across the healthcare landscape for virtually all applications. Cloud-hosted analytics is seeing particularly robust growth.
Providers will continue to lag behind payers and self-insured employers in adopting condition management solutions. There are two key indicators why: specifically, CMS's reluctance to reimburse virtual Diabetes Prevention Programs, and more generally, the less-than-5% uptake of the CMS chronic care management billing code. By lagging, providers risk further isolation from value-based efforts to improve outcomes while controlling costs.
HIT – Awareness of the CCM billing code (CPT 99490) remains moderate among providers, and adoption is still estimated at a paltry sub-15%.
Mobile accessibility is critical for dynamic care management, especially across the ambulatory sector. More than 75% of provider-focused care management vendors will have an integrated, proprietary mobile application for patients and caregivers by end of 2018. These mobile-enabled solutions will also facilitate collection of patient-reported outcome measures, with 50% of solutions offering this capability in 2018.
MIXED – While the majority of provider-focused care management vendors do have an integrated mobile application (proprietary or via partnership), the ability to collect PROMs through an integrated mobile solution remains limited.
A wide range of engagement, PHM, EHR, and care management solutions will make progress on documenting social determinants of health, but no more than 15% of solutions will be able to automatically alter care plan interventions based on SDoH in 2018.
HIT – despite all the hoopla in the market about the need to address SDoH in care delivery, little has been done to date to directly affect dynamic care plans.
The hard, iron core of this issue is uncertainty about its real impact. No one knows what percentage of patients or encounters are impacted when available data is rendered unavailable – intentionally or unintentionally. Data blocking definitely happens but most HCOs will rightly wonder about the federal government’s willingness to go after the blockers. The Office of the National Coordinator might actually make some rules, but there will be zero enforcement in 2018.
MIXED – Last December we said, “The hard iron core of this issue is uncertainty about its real impact.” Still true. Supposedly, rulemaking on information blocking is complete but held up in the OMB. The current administration does not believe in regulation. So “data blocking” may be defined but there was and will be no enforcement or fines this year.
Providers will pull back on aggressive plans to broadly adopt and deploy PHM solution suites, leading to lackluster growth in the PHM market of 5% to 7% in 2018. Instead, the focus will be on more narrow, specific, business-driven use cases, such as standing up an ACO. In response, provider-centric vendors will pivot to the payer market, which has a ready appetite for PHM solutions, especially those with robust clinical data management capabilities.
HIT – PHM remains a challenging market from both payment (at-risk value-based care still represents less than 5% of payments nationwide) and value (lack of clear metrics for return on investment) perspectives. All PHM vendors are now pursuing opportunities in the payer market, including EHR vendors.
This is a case where the threat of alert fatigue is preferable to the reality of report fatigue. Gaps are important, and most clinicians want to address them, but not at the cost of voluminous dashboards or reports. A single care gap that is obvious to the clinician opening a chart is worth a thousand reports or dashboards. By the end of 2018, reports and dashboards will no longer be delivered to front-line clinicians except upon request.
MISS – Reports and dashboards are alive and well across the industry and remain the primary way to inform front-line clinicians about care gaps.
Arterys, Quantitative Insights, Butterfly Network, Zebra Medical Vision, EnsoData, and iCAD all received FDA approval for their AI-based solutions in 2017. This is just the start of AI’s future impact in radiology. Pioneer approvals in 2017 — such as Quantitative Insights’ QuantX Advanced breast CADx software and Arterys’s medical imaging platform — will be joined by many more in 2018 as vendors look to leverage the powerful abilities of AI/ML to reduce labor costs and improve outcomes dependent on digital image analysis.
HIT – With about a month left in 2018, the count of FDA-approved algorithms year to date is approaching 30 and could potentially hit three dozen by year end. This is a significant ramp-up in the regulatory pipeline, but more is needed in the way of clear FDA guidance on how it plans to review continuously learning systems, as well as best practices for leveraging real-world evidence in algorithm training and validation.
What do you think of 2018 for health IT?
The Rise of APIs and App Stores In Healthcare
Two years ago, we published a report on the promise of open APIs in healthcare. In APIs for a Healthcare App Economy: Paths to Market Success (available as a free download), provider organizations told us that developing and using APIs was low on their list of priorities. Modern REST-style APIs were still not on the radar for most providers and payers.
Back then, HCOs large and small said they expected their EHR vendor to build an API infrastructure for them. Two years ago, only Allscripts and athenahealth offered an app store along with a comprehensive developer support program. At that time, the other EHR vendors were slow-walking FHIR support and had only vague plans for app stores and developer support programs.
Since then, market conditions have continued to change. EHR vendors are now rolling out, and more vocally promoting, the API infrastructures that will bring healthcare into the mainstream of 21st-century computing. Every major EHR vendor has delivered a variety of proprietary, HL7, FHIR, and SMART APIs, leveraging REST to extend their products.
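These FHIR APIs all follow the same REST pattern: a resource type and ID addressed by URL, returned as JSON. A minimal sketch of the FHIR "read" interaction, where the base URL and patient ID are hypothetical placeholders:

```python
# Sketch of the FHIR "read" interaction: GET {base}/{resourceType}/{id}
# The endpoint and ID below are hypothetical, not any vendor's real server.
FHIR_BASE = "https://ehr.example.com/fhir"

def fhir_read_url(base: str, resource_type: str, resource_id: str) -> str:
    """Build the canonical FHIR read URL for a single resource."""
    return f"{base}/{resource_type}/{resource_id}"

url = fhir_read_url(FHIR_BASE, "Patient", "12345")
# A real SMART app would now issue an authorized request, e.g.:
#   requests.get(url, headers={"Accept": "application/fhir+json",
#                              "Authorization": f"Bearer {access_token}"})
print(url)  # https://ehr.example.com/fhir/Patient/12345
```

The uniformity is the point: the same URL shape works against any conformant server, which is what makes cross-vendor apps plausible at all.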
Each of these companies sponsors, or will soon sponsor, an app store for third-party innovation, and there has been a concurrent rise in interest in using APIs within provider organizations. A recent Chilmark report, Healthcare App Stores: 2018 Status and Outlook, examines some of these platforms, and the progress made to date, in more depth.
That said, some things have not changed. EHR data remains the most valued data resource in healthcare. All but the largest provider organizations are dependent on their EHR vendor for API enabling technologies. Small and independent developers struggle to participate in app stores and EHR developer support programs, despite great ideas for better apps to improve care delivery. And physician frustration with EHRs continues to grow.
Developing and using APIs is a priority for healthcare stakeholders who want to get more from their EHR investments as they identify opportunities for workflow improvements, real-time analytics dashboards, and more. While EHR vendors are leading the charge, our more recent research suggests that many other stakeholders hold or control access to other key data sources that could underpin such efforts.
The opportunity to capitalize on data already collected to provide advice and predictions is too big to ignore, and the easiest way to do this efficiently will be with integrated apps that can pull data from any relevant resource to provide the necessary insights at the right time.
The number of apps in EHR app stores grows monthly. Yet since the collapse of Happtique's efforts in 2013, the potential role and benefits of an independent certification body have been absent from the discussion about APIs and app stores. Currently, EHR vendors certify apps for their customers based on internal evaluations, but the rigor and process vary by vendor. An independent, impartial body could do more than inform prospective users: it could deliver tremendous value if its assessments were multi-pronged, covering the ongoing use of the app as well as a consistent way to think about safety, security, and dependability. While a certification authority could make choices easier for decision-makers, the real value would be in giving users more information about how the app delivers value across its install base.
Sometime in the next few months, the ONC will issue new rules on information blocking and what constitutes an API that does not require “special effort” to access and use. While these actions may seem like a watershed moment for health IT, the provider market has moved perceptibly since ONC began its rulemaking. Just two years ago, providers were curious about APIs and app stores but they weren’t ready to make any commitments.
Slowly and inexorably, healthcare is embracing the downloadable app as a tool for innovation and improvement. One-size-fits-all platforms are not meeting the needs of the industry and apps can do more to assist with care provision needs than just provide supplemental functionality – to read more about opportunities for apps to have significant impact, take a look at the infographic we developed to accompany our more recent report, or read more in this deep dive post from June.
Epic Looks to the Cosmos
At Epic’s recent UGM conference in Verona, WI, CEO Judy Faulkner painted a very big vision of the future – “One Virtual System Worldwide.” She was speaking to the Epic faithful on where Epic and its customers would travel next, a place in the cosmos leading to dramatic breakthroughs in clinical science by phenotyping the de-identified EHR data of all Epic clients.
A foundational element of that virtual system is a new platform, Cosmos. Cosmos is a hosted data warehouse built on the Caboodle stack and will include a hosted version of Epic's analytics toolset, Slicer Dicer, that researchers can use to explore the data. That would be incredibly cool, as today there are about 200 million patients with health data in an Epic EHR.
But there may be a tear in the Cosmos. While this is a new release, today Epic has only convinced a small handful of customers to participate. Healthcare providers, particularly large academic medical centers, may be wary of submitting their data for others to use, even for research.
Just how open that virtual system Judy speaks of is to other, competing healthcare solutions is unclear. To Epic’s credit, over the last few years, they have opened up to a significant degree – at least relative to their past walled garden approach to the market. Today, over 100 third-party apps are now available within their App Orchard and a couple of hundred more are in the wings (for more on exactly what App Orchard can and cannot do, Chilmark recently profiled it in our report on health care app stores). The company is also aggressively looking to create a store for new ML/AI models. Today, over 200 clients have developed some 500+ models that work within Epic and that number grows daily.
The company has also made strong headway on the interoperability front, exchanging some 3.5 million records daily – over a third with non-Epic EHRs. We saw numerous examples in presentations by Epic clients of their use of CareEverywhere to enable care coordination across a heterogeneous EHR network, a common situation among today's Accountable Care Organizations (ACOs).
Epic COO Carl Dvorak's keynote was far more pragmatic, focusing on how organizations can derive greater ROI from their Epic EHR through benchmarking. Claiming over 700 benchmarks developed to date, measuring everything from clinical workflows to an analyst's use of Epic's analytics tools, Carl provided some good examples of how an organization can improve workforce performance. With ever-tightening margins for most healthcare organizations, this message was well-received.
One of the more curious aspects of UGM this year was the near total lack of discussion on the migration to value-based care. Listening to the Epic presentations, visiting the booths of their various modules, looking over the program guide, one was struck by the sheer dearth of attention to this increasing challenge for provider organizations. A provider CEO did confide to me that during the concurrent CEO forum, this topic was discussed at length. But one has to wonder why Epic chose to seemingly ignore this issue, especially for the rank and file users of Epic.
And despite Epic's growing openness, some of the comments Epic executives made regarding patient access to their data via APIs from third-party apps (e.g., Apple's Health app) did not sit well with me. Epic's position is that patients don't necessarily understand the full ramifications of their data being used by others, via various apps, and that it may end up in nefarious hands.
Epic – let me make that call. Who has access to my health data is my decision, not yours. Your stance harkens back to old, paternalistic modalities in healthcare that are thankfully fading.
No enterprise software vendor is perfect, and Epic is no exception, but at UGM one is always struck by the devotion of its users. This comes down to culture – Epic is a company that really does want to do the right thing, though competitors and others may bristle at what that right thing may be. This was best summed up in a conversation I had with an elderly gentleman from Denmark, the CIO of one of that country's regional health systems. He stated quite simply:
“I’ve spent my whole life working in enterprise software. Epic is the first company I have ever worked with that truly wants to do the right thing for the customer – they really listen to our input(s).”
That alone is testament enough as to the continuing success of Epic. Will that sense of mission to do the right thing to improve clients’ success and in turn improve healthcare delivery extend beyond Judy’s tenure? Only time will tell, but I sure hope so.
Da Vinci Project: FHIR Meets Value-based Payments
While healthcare frets and obsesses about the state of exchange between providers, payers have been relatively slow to embrace modern ideas about data movement. On the federal level, Blue Button 2.0 lets Medicare beneficiaries download or authorize the download of their own health data. This federal program has generated a lot of interest, signing up roughly 500 organizations and 700 developers at last count. Claims data on 53 million beneficiaries spanning 4.5 years is available for authorized developers and applications. Commercial payers have yet to emulate such an approach. However, there are indications that they are beginning to take notice and act.
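For developers, Blue Button 2.0 exposes those claims as FHIR ExplanationOfBenefit resources behind OAuth 2.0. A rough sketch of the search URL an authorized app would construct; the sandbox base URL and beneficiary ID here are illustrative placeholders, not guaranteed production values:

```python
from urllib.parse import urlencode

# Illustrative Blue Button 2.0-style FHIR search. The base URL mirrors the
# public sandbox pattern and the patient ID is a placeholder; both are
# assumptions for this sketch.
BB2_BASE = "https://sandbox.bluebutton.cms.gov/v1/fhir"

def eob_search_url(base: str, patient_id: str, count: int = 10) -> str:
    """Build a FHIR search for a beneficiary's claims (ExplanationOfBenefit)."""
    query = urlencode({"patient": patient_id, "_count": count})
    return f"{base}/ExplanationOfBenefit?{query}"

url = eob_search_url(BB2_BASE, "-20140000008325")
# A real request carries the OAuth 2.0 access token the beneficiary granted:
#   requests.get(url, headers={"Authorization": f"Bearer {token}"})
```

The key design point is consent: the app only gets a token after the beneficiary explicitly authorizes it, which is the model commercial payers would need to emulate.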
The Da Vinci Project is a new private sector initiative to address the technical requirements for FHIR-based data exchange among participants in value-based care (VBC) programs. Its general goal is to help payers and providers with techniques and ideas for information exchange that contribute to improved clinical, quality, and cost outcomes. More specifically, it wants to facilitate the creation of use case-specific reference implementations of FHIR-oriented solutions to information exchange challenges in value-based care.
The Da Vinci Project is a relatively new project from Health Level 7. Its founders represent organizations with experience across a range of VBC business challenges and the FHIR standard. Currently, it has the support of 27 organizations: 11 payers, 10 health IT vendors (including 3 EHR vendors), and 6 providers.
Da Vinci aims to deliver implementation guides (IG) and reference software implementations for data exchange and workflows needed to support providers and payers entering and managing value-based contracts and relationships. FHIR implementation guides are roughly analogous to an IHE profile in that they define actors and actions in a defined use case. In this instance, the idea is to help providers and payers operationalize their complementary processes in value-based contracts.
Da Vinci at one time contemplated creating nine use cases. It eventually narrowed this focus to developing IGs for two that have particular VBC relevance:
Medication reconciliation is a time-consuming part of patient encounters anywhere on the care continuum. Post-discharge, this capability can reduce the incidence of adverse drug events and head off re-hospitalizations. The objective of this project is to create a simple workflow that allows providers to attest that a medication reconciliation is complete.
Coverage requirements discovery enables a provider to request payer coverage requirements within their clinical workflow. In-workflow coverage details can make point-of-care decisions about orders, treatments, procedures, or referrals more efficient. The ability to make both clinical and administrative decisions mid-encounter, with the patient present, minimizes the non-clinical cognitive burden on providers. By reducing denials and delays, administrators can be freed for other patient-specific work. As prior authorization becomes more common in value-based payment programs, providers could save time if such information were more readily available to them.
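The coverage requirements use case builds on the CDS Hooks pattern: the EHR calls out to a payer service at a defined workflow moment. A rough sketch of such a request; the field names are simplified and hypothetical, as the balloted IG defines the authoritative shape:

```python
import json

# Rough sketch of a CDS Hooks-style request an EHR might send to a payer
# service to discover coverage requirements while an order is being drafted.
# Field names are simplified assumptions, not the balloted IG's exact schema.
def coverage_requirements_request(patient_id, coverage_id, draft_order):
    return {
        "hook": "order-select",            # fired as the clinician drafts an order
        "context": {
            "patientId": patient_id,
            "draftOrders": [draft_order],  # FHIR resources for the pending order
        },
        "prefetch": {"coverage": f"Coverage/{coverage_id}"},
    }

req = coverage_requirements_request(
    "123", "cov-1", {"resourceType": "ServiceRequest", "code": {"text": "MRI"}})
print(json.dumps(req, indent=2))
```

The payer's response would come back as "cards" the EHR can render inline – documentation requirements, prior-auth flags, and the like – which is what makes the exchange mid-encounter rather than after the fact.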
Both IGs will be balloted in HL7’s September Ballot Cycle, and Da Vinci is actively encouraging people to comment. Da Vinci team members will be on hand to help implementers interested in these two initial use cases at HL7’s September Connectathon in Baltimore, MD. It has also begun working on requirements for a third use case, Documentation Templates and Payer Rules.
While the data scope of this effort is not nearly as broad as Blue Button’s, which provides access to a patient’s claims history, it is a reasonable first step. Making coverage information REST-accessible and granular could underpin applications, vastly improving the flurry of phone calls, faxes, and EDI-based document exchange that bedevil hospitals and physician offices. Hopefully, this effort is a leading indicator of better access to commercial payer data to come for patients and providers.
Healthcare App Stores Moving to Mainstream
Update, 9/4/2018: This post was based on research conducted for the report, Healthcare App Stores: Status and Outlook, which is now available.
Apple's early and fantastic success with its app store spawned imitators. App stores are a relatively new wrinkle in healthcare IT vendors' approach to relationships aimed at better serving existing customers or finding new ones. Most major IT vendors to payers and providers have partnerships with other companies that provide complementary technology or services. These arrangements allow the partners to reach markets, users, or use cases that either vendor would find challenging on its own.
In the wider economy, successful app stores rely on a widely accepted set of Web technologies. Chief among these are REST-style APIs offering programmers simple and uniform access to data and functionality across organizations and systems. REST APIs provide the data fuel that transformed consumer and enterprise apps. Most API programs are open in that documentation is available to anyone. Programmers can often use the APIs without any interaction with the API publisher; in other cases, they can use them after obtaining an API key, usually a simple online process. Many API program sponsors monetize access above certain call-volume thresholds. Health Level 7, the standards organization, created Fast Healthcare Interoperability Resources (FHIR) to enable modern, REST-style APIs that are far less unwieldy for programmers than traditional HL7, IHE, or C-CDA interfaces. The advent of FHIR-standard APIs and SMART's use of OAuth supplies the crucial building blocks for effective app stores in healthcare.
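SMART's contribution is the OAuth-based launch: the app redirects the user to the EHR's authorization server with standardized scopes. A minimal sketch of constructing that redirect, where the endpoint, client ID, and state value are hypothetical:

```python
from urllib.parse import urlencode

# Sketch of the SMART on FHIR authorization redirect an app constructs.
# The authorization endpoint and client ID are hypothetical; the scope
# string follows SMART's published scope conventions.
def smart_authorize_url(auth_endpoint, client_id, redirect_uri, fhir_base):
    params = {
        "response_type": "code",                  # OAuth 2.0 authorization code flow
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "scope": "launch/patient patient/*.read", # SMART scope syntax
        "aud": fhir_base,    # the FHIR server the resulting token targets
        "state": "xyz123",   # anti-CSRF value; generate randomly in practice
    }
    return f"{auth_endpoint}?{urlencode(params)}"

url = smart_authorize_url("https://ehr.example.com/oauth/authorize",
                          "app-123",
                          "https://myapp.example.com/callback",
                          "https://ehr.example.com/fhir")
```

After the user approves, the app exchanges the returned code for an access token and can then call the FHIR APIs on the user's behalf – the same handshake pattern consumers already know from "Sign in with" buttons.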
App stores facilitate the distribution of applications that add value to some base platform or brand. On the strength of the large volume and variety of data these companies collect for their HCO customers, the EHR is that platform. EHRs also have strong brand recognition among clinicians. The vast majority of HCOs lack the resources to develop their own applications, relying mostly on their EHR vendor for new and better functionality. Ironically, most EHR vendors recognize that they lack the resources or bandwidth to be the sole source for new and better functionality.
Other categories of companies could be the platform that establishes and hosts an app store. Payers and providers have the strongest brand name recognition with patients and their caregivers. Medical equipment vendors have large data volumes and established brands among healthcare workers that could provide visibility and attention from users. Companies that provide healthcare transaction or data services have amassed stupendous data volumes but remain mostly unknown to patients and are not top-of-mind for healthcare workers. In theory, any entity that holds large inventories of patient data or has a strong brand could sponsor an app store.
The technical potential already exists to access data or services from any combination of provider, payer, data aggregator, or another source. For now, the willingness to make such capabilities a reality is lacking, given that healthcare’s financial incentives almost demand that organizations hoard, monetize, or closely control the use of their data. While the challenges are daunting, the opportunity to improve healthcare is vast. Healthcare users are eager to discuss their unmet needs. Small and independent developers are eager to address these needs.
While better applications for clinicians and patients are a priority, the demand for more effective use of IT in healthcare is not limited by user category or use case. Some specific examples include:
The recent and widespread adoption of EHRs highlights the need to supplement or enhance the EHR itself. EHRs are not the only opportunity, since only a minority of the healthcare workforce uses an EHR regularly. The need to activate and engage patients with more effective applications is also important for most healthcare stakeholders. Patients have only ever known a disjointed “customer experience” with healthcare. Ideas such as convenience, price transparency, or predictability would never enter into the average patient’s expectations of interacting with any aspect of the healthcare system. But patients are beginning to expect a more conventional “consumer” experience with healthcare. The proliferation of high-deductible plans and rapidly increasing out-of-pocket costs could eventually produce a market or regulatory response that could impose a different approach and processes for consumers.
In theory, an app could discover patient data at multiple organizations using a network-based record location service’s APIs. It could access that data directly from as many organizations as the application and use case demand. It could match pertinent records using an API-accessible master patient index service. It could access workflow from yet another network-based service to support part or all of a patient encounter. This kind of orchestrated functionality using distributed data is increasingly commonplace in consumer and some enterprise apps where the application uses REST-based access to data and functionality from multiple organizations. Healthcare is years away from being able to deploy such a scenario, but many in HIT would like to see it become a reality.
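The orchestration described above can be sketched as a pipeline of service calls. Everything here is a stub stand-in – no such universally available record locator, MPI, or data-holder APIs exist in healthcare today – but the shape of the code is exactly what consumer apps already do with REST services:

```python
# Hypothetical sketch of the orchestration described above. Each service
# (record locator, master patient index, per-organization data fetch) is a
# stub; in a real deployment each would be a network-based REST API.
def gather_patient_data(demographics, record_locator, mpi, fetch):
    orgs = record_locator(demographics)                # 1. where does data live?
    master_id = mpi(demographics)                      # 2. resolve to one identity
    records = [fetch(org, master_id) for org in orgs]  # 3. pull from each holder
    return {"patient": master_id, "records": records}

# Stub services standing in for real network APIs:
demo = {"name": "Jane Doe", "dob": "1970-01-01"}
result = gather_patient_data(
    demo,
    record_locator=lambda d: ["org-a", "org-b"],
    mpi=lambda d: "mpi-001",
    fetch=lambda org, pid: {"org": org, "patient": pid, "data": []},
)
```

The hard part is not the code but the willingness of each service's owner to expose it, which is the point the surrounding discussion keeps returning to.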
We are preparing a report on the status and outlook of healthcare app stores. For now, EHR vendors have shown strong interest in investing to reach new markets, users, and use cases. Most vendors recruit third-party developers and companies into their developer programs outside of their clinician user base. Third-party experiences with EHR vendors such as Epic, Allscripts, and athenahealth have varied in terms of support, costs, and allowed functionality. This report not only inventories the ways that EHR vendors are expanding functional and organizational app coverage; it also looks at the outlook for some of the other potential app store platform hosts.
Promoting Interoperability: MU Fades to Black
Seeking to liberate the industry from its self-created morass of siloed data and duplicative quality reporting programs, the Department of Health and Human Services (HHS) issued 1,883 pages of proposed changes to Medicare and Medicaid. It renamed the Medicare and Medicaid Electronic Health Record (EHR) Incentive Programs (known by all as Meaningful Use) to Promoting Interoperability Programs (PI).
As widely reported, it would eliminate some measures that acute care hospitals must report and remove redundant measures across the five hospital quality and value-based purchasing programs. It would also reduce the reporting period to 90 days. HHS will be taking comments until June 25, 2018.
HHS believes that APIs will solve all of the problems that patients and healthcare stakeholders have with data access. HHS also seems prepared to declare that TEFCA compliance and 2015 Edition CEHRT guarantees that those APIs are in place.
HHS believes that requiring hospitals to use 2015 Edition CEHRT in 2019 makes sense because such a large proportion of hospitals are "ready to use" the 2015 Edition. But ready to use is not the same as using: 2015 Edition EHRs may not be as widely deployed as HHS indicates. A 10-month-old snapshot from ONC showed that hospitals had not aggressively moved to adopt 2015 Edition CEHRT.
Current adoption levels by HCOs are undoubtedly better, and many vendors have 2015 Edition technology ready to go, but hospitals can only change so fast. The rush to get hospitals onto the most current edition comes down to the most relevant difference between the 2014 and 2015 Editions – the API requirement. APIs will be the technical centerpiece of better, more modern interoperability, but adoption levels are still low. APIs, by themselves, offer the promise of better data liquidity. For this promise to become a reality, healthcare stakeholders need more than just a solid set of APIs.
HHS is also proposing that hospitals post standard charges and update that list annually.
This is a nice thought, but it will take some heavy lifting to pull this off. For starters, HHS doesn’t even have a definition of “standard charge” and is seeking stakeholder input before the final rule is published. HHS also must determine how to display standard charges to patients, how much detail about out-of-pocket costs to include (for patients covered by public and private insurance), and what noncompliance penalties are appropriate.
Above all, there’s the thorny issue of establishing what a standard charge is in the first place. Charges vary by payer. Can a hospital truly state, without a doubt, the cost of an MRI or a colonoscopy? Most cannot – and technology alone will hardly solve this problem.
The existence of APIs will stand in the stead of the old view/download/transmit (VDT) requirement. Regarded as one of meaningful use’s most troublesome and fruitless requirements, this rule has been shed by HHS because of “ongoing concern with measures which require patient action for successful attestation.”
VDT is one of several MU Stage 3 requirements pertaining to patient engagement – along with providing secure messaging or patient-specific educational resources – that HHS has proposed dropping, under the pretense that it is “burdensome” to healthcare providers. While hospitals have struggled to get many patients to participate, the VDT requirement set the bar at one patient out of an entire population. What’s more, dropping the requirements fails to take into account how burdensome it is for patients to try to access their data, communicate with their physicians, and learn about their conditions and treatment options. It is also contrary to CMS Administrator Seema Verma’s remarks, first at HIMSS18 and again this week, indicating that the agency seeks to “put patients first.”
HHS says that third-party developed apps that use APIs will deliver “more flexibility and smoother workflow from various systems than what is often found in many current patient portals.” Whether such apps deliver “smoother workflow” is not a foregone conclusion.
HHS proposes “a new scoring methodology that reduces burden and provides greater flexibility to hospitals while focusing on increased interoperability and patient access.” The proposed scoring methodology uses a 100-point system (explained over 24 pages) in which attaining a score of at least 50 means there will be no Medicare (or Medicaid) payment reduction.
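A threshold-scored program like this is easy to illustrate. The sketch below is purely hypothetical – the measure names and weights are invented for illustration and are not the ones in the proposed rule – but it shows the basic mechanics of a 100-point system where a score of at least 50 avoids the payment reduction:

```python
# Hypothetical sketch of a 100-point, threshold-scored program.
# Measure names and weights are illustrative only, not from the rule.

MEASURE_WEIGHTS = {
    "e_prescribing": 20,
    "health_information_exchange": 40,
    "provider_to_patient_exchange": 20,
    "public_health_reporting": 20,
}

def pi_score(performance: dict) -> int:
    """Sum each measure's weight scaled by its 0.0-1.0 performance rate."""
    return round(sum(MEASURE_WEIGHTS[m] * rate for m, rate in performance.items()))

def payment_reduction(score: int, threshold: int = 50) -> bool:
    """A score at or above the threshold avoids the payment reduction."""
    return score < threshold

score = pi_score({
    "e_prescribing": 0.9,                 # 18 of 20 points
    "health_information_exchange": 0.5,   # 20 of 40 points
    "provider_to_patient_exchange": 0.6,  # 12 of 20 points
    "public_health_reporting": 0.4,       #  8 of 20 points
})
print(score, payment_reduction(score))  # 58 points -> no reduction
```

The point of such a scheme is that middling performance across heavily weighted measures can still clear the bar, which is precisely the flexibility HHS is advertising.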
HHS is also mulling whether to abandon these measures altogether in favor of scores calculated at the objective level.
The biggest regulatory effort in recent months related to interoperability, other than this proposal, has been ONC’s proposed Trusted Exchange Framework and Common Agreement (TEFCA), required under the 21st Century Cures Act. TEFCA, well along in the planning stages, is a new set of regulations from ONC whose goal is to catalyze better data availability using APIs. HHS in this regulation wants public comment on whether participation in a TEFCA-compliant network should replace the process measures in the Health Information Exchange objective. Stated another way: Should TEFCA compliance replace 80 percent of the score for PI (75 percent in 2020)?
TEFCA is widely expected to provide a safe harbor from data blocking liability although ONC has been mum on this point. TEFCA then could do double duty: Eliminate the need to meet or report on health information exchange metrics and provide a shield from data blocking enforcement.
But there are, as yet, unanswered questions about TEFCA:
HHS is also considering doing away with the Public Health and Clinical Data Exchange objective. It floated the idea that a provider that supports FHIR APIs for population-level data would not need to report on any of the measures under this objective. This would replace 90 percent of the score for PI (85 percent in 2020) when combined with the TEFCA knockout.
The specific API mentioned, called Flat FHIR and still in development, will probably contribute to part of the complex process of public health and registry reporting. This activity currently requires highly skilled data hunter-gatherers, usually with clinical credentials. In many organizations, these hunter-gatherers manually sift and collate multiple data sources to meet the varied requirements of the recipients of different registries. Flat FHIR, assuming it were production-ready, would certainly help, but it is unlikely that it could provide all, or even most, of the information needed for the range of public health reporting programs.
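For readers unfamiliar with Flat FHIR, the draft Bulk Data Access specification describes an asynchronous “$export” operation: the client kicks off a population-level export with a single request and then polls for the prepared files. The sketch below only constructs such a request (no network call); the base URL is hypothetical, and the spec was still a draft at the time of writing:

```python
# Sketch of a FHIR bulk-data ("Flat FHIR") export kick-off request, per the
# draft Bulk Data Access specification. The endpoint URL is hypothetical.

BASE_URL = "https://ehr.example.com/fhir"  # hypothetical FHIR server

def bulk_export_request(resource_types):
    """Build the async $export request for population-level data."""
    return {
        "method": "GET",
        "url": f"{BASE_URL}/Patient/$export?_type={','.join(resource_types)}",
        "headers": {
            "Accept": "application/fhir+json",
            # Signals async operation; the server replies 202 Accepted with a
            # Content-Location URL the client polls until the export is ready.
            "Prefer": "respond-async",
        },
    }

req = bulk_export_request(["Patient", "Immunization", "Condition"])
print(req["url"])
```

Even in this best case, the export yields raw resources; the curation and collation work that registry reporting demands would remain with the hunter-gatherers.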
HHS acknowledges that providers are less than thrilled with aspects of the Quality Payment Program (QPP). HHS wants to know how PI for hospitals can better “align” with the requirements for eligible clinicians under MIPS and Advanced APMs. In particular, it wants ideas about how to reduce the reporting burden for hospital-based MIPS-eligible clinicians. It is undoubtedly looking for market-acceptable ideas to reduce the reporting burden where it is arguably more deeply felt – among non-hospital-based MIPS-eligible clinicians. While reducing or eliminating the reporting burden would help such providers, the big unanswered question, as it is with hospitals, is the burden of getting to 2015 Edition CEHRT.
HHS also asks the industry how it could use existing CMS health and safety regulations and standards to further advance electronic exchange of information. It is ready to change Conditions of Participation (CoPs), Conditions for Coverage (CfCs), and Requirements for Participation (RfPs) for Long Term Care Facilities regulations to this effect. It wants to know whether requiring electronic exchange of medically necessary information in these regulations would move the interoperability needle.
HHS believes that APIs will solve all of the problems that patients and healthcare stakeholders have with data access. HHS also seems prepared to declare that TEFCA compliance and 2015 Edition CEHRT guarantee that those APIs are in place. It roundly ignores the mesh of incentives that make stakeholders unwilling to share data and patients unable to access data. The industry has cried out for less process reporting and better insight into outcomes for years. This will accomplish the former but set the industry back with respect to the latter if interoperability is declared solved based on technology alone.
Blockchain in Healthcare: Enthusiasm is growing, but still a long way to go to realize impact
The speculative craze around Bitcoin and the Initial Coin Offering (ICO) market for startups in the digital currency and blockchain space has pushed interest in blockchain beyond the crypto-community and into the mainstream. Speculation in the ICO market drove investment in these vehicles to nearly $7 billion in 2017, virtually dwarfing venture capital for startups across all sectors, including healthcare. Among the hundreds of startups in the blockchain ICO count are over 70 dealing with healthcare, according to industry observer Vince Kuraitis. But speculative capital rarely translates into long-term sustainability and the disruptive business models that startup founders espouse. Limitations in scalability, transaction speed, and energy consumption have been among the dominant concerns. This is particularly true in healthcare, and we have reached an important point in the evolution of the blockchain space where it is worth pausing to take stock of how blockchain applications are evolving and which specific pain points in health IT current blockchain infrastructures can realistically address.
Blockchain can be valuable in contexts where there is dependency between transactions and an asset, such as data, passes from one party to another. Furthermore, when verification of the integrity or provenance of the data is valuable the immutability and time stamping features that blockchain provides are very useful.
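The immutability and time-stamping properties described above can be illustrated with a minimal hash chain. This is a sketch only – real blockchains add consensus, digital signatures, and distribution across nodes – but it shows why tampering with an earlier record is detectable by anyone who re-verifies the chain:

```python
import hashlib
import time

# Minimal hash-chain sketch of blockchain's immutability and time-stamping.
# Illustrative only; real blockchains add consensus, signatures, distribution.

def make_block(data: str, prev_hash: str) -> dict:
    """Create a block whose hash covers its data, timestamp, and predecessor."""
    block = {"data": data, "prev_hash": prev_hash, "timestamp": time.time()}
    payload = f"{block['data']}|{block['prev_hash']}|{block['timestamp']}"
    block["hash"] = hashlib.sha256(payload.encode()).hexdigest()
    return block

def verify(chain) -> bool:
    """Recompute every hash and check the links; any tampering breaks the chain."""
    for i, block in enumerate(chain):
        payload = f"{block['data']}|{block['prev_hash']}|{block['timestamp']}"
        if block["hash"] != hashlib.sha256(payload.encode()).hexdigest():
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

chain = [make_block("record transferred: A -> B", "0" * 64)]
chain.append(make_block("record transferred: B -> C", chain[-1]["hash"]))
print(verify(chain))   # True
chain[0]["data"] = "record transferred: A -> X"   # tamper with history
print(verify(chain))   # False
```

Because each block’s hash covers the previous block’s hash, altering any past transfer invalidates every subsequent link – which is what makes provenance claims auditable.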
Chilmark Research is releasing our Market Scan Report (MSR), “Blockchain in Healthcare,” with this goal in mind. We wanted to explore the first generation of use cases and startups in the blockchain space and provide a strong foundation for understanding the potential applications in areas that speak to the core capabilities of blockchain as they relate to payers and providers: identity management, data sharing/exchange, provider directories, patient indices, claims adjudication, supply chains, and revenue cycle management. We can think of blockchain as having a role in addressing pain points in the healthcare system where we see some of the following challenges:
Our research into the current applications in development and most-ready for prime time in the next year include companies working in the following areas of the provider-payer nexus:
We provide brief vendor profiles of consulting firms, large tech providers, and startups working in the blockchain space, along with an analysis of where we see this market going in the coming years. These include some of the larger consulting and tech firms offering enterprise Blockchain-as-a-Service, including IBM, Accenture, Deloitte, and T-Systems. The startups profiled include PokitDok, Simply Vital Health, Solve.Care, MedRec, Change Healthcare, and Factom. There are also alternatives to blockchain, and a growing number of companies have chosen to develop “blockchain friendly” applications until the blockchain ecosystem reaches a higher level of maturity. Google’s DeepMind Health is using an alternative distributed ledger technology with patient records in the UK’s National Health Service after creating an uproar over perceived misuse of patient data without approvals or prior consent; an auditable ledger offers a way to address this controversy. A supply chain/cybersecurity offering by Cloudface provides an analytics solution for hospital supply chains, with plans to create blockchain applications in the near future. There is also growing interest in IOTA’s Tangle for the Internet of Things (IoT).
One interesting example that emerged at the end of our writing this report was the recently announced Optum-Quest-Humana-MultiPlan blockchain initiative to improve provider directories. This is part of an effort to streamline back-office operations for payers. The initiative was developed to address the problem that arises when claims from providers enter the system and there is a mismatch between the records payers have on providers and the provider identity. Often the payer directories are not up to date, and the resulting lack of reconciliation of claims submitted by providers results in delays or non-payment of claims. The initiative will launch a pilot during the summer of 2018. What is important about this initiative is the involvement of multiple stakeholders, making it a more salient use case of blockchain for pain points resulting from lack of coordination and effective sharing of data across multiple companies. We view blockchain as a tool for industry transformation rather than disruption of companies, and this is a prime example of how to begin thinking about utilizing blockchain in a transformative manner. Over the next decade, we expect to see fewer discussions of blockchain in isolation; it will be a component alongside the cloud, AI, and other technologies used to automate administrative functions and enable more efficient sharing of data.
The example above also points to how Chilmark Research is beginning to think about the future of blockchain in healthcare. Blockchain is an inherently complex new technology that necessarily involves coordination across networks of stakeholders. This means that a blockchain ‘disruption’ of healthcare that solves the interoperability challenge or other major pain points with a single technological quick fix is nowhere close on the horizon. We do believe, however, that the full transformative potential of blockchain will play out over the coming decade. 2018 will see a growing number of pilots and experiments, as the hype of 2016-17 spurred considerable interest.
Even government has gotten involved, with a Blockchain Caucus in Congress. The ONC’s white paper challenge and recent Trusted Exchange Framework bode well for blockchain’s future as well. Blockchain proponents would be well served in the long run by moving beyond a strictly techno-centric approach and developing more robust thinking on managing consortia, governance mechanisms, and the broader cryptoeconomics of blockchain in the context of the health economy. Thinking is lagging on these fronts, and the froth around easy money via ICOs in 2017 has not helped develop the critical thinking needed for blockchain’s long-term success in the health IT ecosystem.
For those interested in learning more, Chilmark Research will be attending the Second Annual Healthcare Blockchain Summit on June 11-12 in Boston, where many of the vendors and use cases we analyze in our report will be represented.
TEFCA Response Drives ONC to Reconsider
The public response to ONC’s proposed Trusted Exchange Framework and Common Agreement (TEFCA) and U.S. Core Data for Interoperability (USCDI) underscores the pent-up demand for better healthcare interoperability. Required by the 21st Century Cures Act, TEFCA and USCDI could impact the future of healthcare data portability in the same way that the EHR Incentive Program shaped EHR adoption. The goal is to catalyze better data availability needed to modernize provider and payer IT application portfolios. Taken at face value, these regulations could profoundly change the way that healthcare stakeholders provide and use each other’s data.
This isn’t ONC’s first hands-on attempt to crack the interoperability nut. Years ago, it awarded a cooperative agreement to DirectTrust to oversee the rollout of secure messaging across the industry. In return, it got a slow but steady adoption curve that has moved the needle but fallen short of providing universal data availability.
This time around, ONC has higher hopes for a more accelerated and effective result. While the former effort relied on established technology, this proposal leans heavily on technology that does not exist – in or out of healthcare – and an architectural approach with a checkered history. The DirectTrust effort combined carrot and stick incentives, while this one offers only the potential for protection from the stick.
TEFCA proposes to join clinicians, hospitals, payers, and others in a national network of networks. It would designate a set of organizations called “Qualified HINs” (QHINs) to facilitate a standardized “on-ramp” for nationwide healthcare connectivity. A separate organization called the Recognized Coordinating Entity (RCE), to be funded by ONC through a cooperative agreement, would manage and oversee this network of QHINs. Once a user or organization is on this national on-ramp, it will be able to exchange data for three different use cases across six permissible purposes.
USCDI defines the data types eligible – or required, depending on how you read it – for exchange. It builds on the Common Clinical Data Set (CCDS) and will expand the roster of exchangeable data types based on maturity and market acceptance. In the short term, it proposes to add structured and unstructured notes and data provenance to the list of CCDS data items.
The response to this proposed regulation was not only voluminous by the standards of ONC rulemaking, but also filled with questions, concerns, requests for clarification, and conflicting interpretations. The sheer number of distinct issues raised by commenters is evidence of the complexity of the topic, the difficulty of ONC’s task, and the problems with the proposal. Healthcare stakeholders want and need to leverage existing investments, but ONC’s new plan looks more like rip-and-replace. ONC’s insistence that TEFCA is voluntary and non-regulatory is cold comfort to any organization trying to forecast how much it will cost to comply.
Before delving into the specific concerns, it’s important to note that the general approach of TEFCA and USCDI moves the conversation in the right direction. It could turn out to be less costly for providers to sign up for one network than it is to enroll in the multiplying number of purpose-built clinical and financial networks. TEFCA also emphasizes query-based exchange of discrete data elements, which represents a desirable interoperability outcome.
ONC has to be pondering its options in the wake of such a stiff and unified response. The 21st Century Cures Act specifically requires it to develop TEFCA and establish rules for data blocking. The Senate HELP committee has already sought testimony on ONC’s progress on both fronts. Besides Congress, the highest levels of the current administration want action to solve the problem of data liquidity. ONC has to act, and soon.
Prior to the public response to TEFCA, ONC was on course to award the RCE cooperative agreement to the Sequoia Project, on the strength of its handling of the eHealth Exchange’s implementation and maintenance after the transition from ONC, and of the governance muscle Sequoia showed through Carequality’s adoption curve. This is relatively unsurprising, since few other existing organizations have the independence, HIT expertise, scale, and track record that Sequoia has assembled.
ONC also seemed destined to declare that TEFCA compliance would provide a safe harbor against data blocking liability. ONC chose not to define data blocking and TEFCA in a single set of regulations. Instead, it asked the industry to accept new “rules of the road” for information exchange without knowing what happens when an organization can’t comply with those rules. ONC did not help its case by forcing the industry to conclude that TEFCA compliance is the only way to qualify for the data blocking safe harbor. Every conversation we’ve had about TEFCA since January is primarily about this linkage. Such a requirement is especially controversial among the smaller organizations that make up the bulk of healthcare, to the extent they are even aware that such a rule could happen. That ONC thought it could deal with data blocking in two regulatory steps is hard to understand.
Our prediction is that ONC will seek another round of public comments and revise TEFCA based on what it hears. It will then release a final version late this year that preserves the overall structure of TEFCA and USCDI from the current proposal. ONC will deal with objections by adding exemptions and exceptions that will phase in some of its more stringent provisions. Most of the exemptions and exceptions will concern TEFCA’s provisions for the data blocking safe harbor and population-level query. Effectively, it will delay the operation of these provisions to some later date. It will probably go ahead and award the cooperative agreement for RCE to the Sequoia Project once TEFCA is in final form late this year.
With all eyes on TEFCA, ONC has been mum about the fate of the “EHR reporting program,” also required by the Cures Act. This program is intended to help hospitals, clinicians, and other users of health information technology better understand how these products support interoperability and usability. ONC originally claimed it could not act on this requirement, pleading lack of budget. Then Congress rejected HHS’ request for a $22 million reduction in ONC’s budget, presumably enabling ONC to develop the program.