Healthcare Provider Analytics and Reporting: Expanding Beyond VBC Use Cases

We will release our newest report, 2019 Healthcare Provider Analytics Market Trends Report, in the next few days. This report reviews the current market for provider analytics and evaluates offerings from 23 different vendors.

Key Takeaways

  • Value-based care is the dominant business driver for adoption of analytics solutions by providers.
  • Reports and dashboards are still the main way that users experience and benefit from analytics technologies.
  • Advanced analytics capabilities are seeing increased interest, but mostly from large HCOs.
  • Deriving actionable plans from the data that goes into analytics solutions remains a challenge.

In recent years, providers invested in analytics technology to support the transition from fee-for-service (FFS) to value-based care (VBC). Vendor offerings that support the variety of pay-for-performance (P4P), pay-for-reporting (P4R), and risk-sharing programs with payers have helped providers better understand the interplay of costs, quality, and utilization in the populations they serve. But the applications for analytics are broader than VBC alone. Provider healthcare organizations (HCOs) are seeking to leverage these technologies more broadly to support a range of clinical, financial, and operational performance improvement goals and programs.

Acute and Ambulatory Use Cases

Provider-oriented analytics availability mirrors EHR penetration. Providers in acute and ambulatory settings have many choices for analytics across multiple use cases. Providers in post-acute settings and others with low EHR penetration have relatively fewer choices. While vendors have devised a number of ways to extend their offerings to underserved settings, not all providers take full advantage of such capabilities.

EHR vendors are often, but not always, providers’ first choice for analytics. Most EHR vendors sell analytics offerings almost exclusively to their existing EHR customers. Independent vendors – not owned by an EHR vendor or a payer – are a strong alternative to EHR companies for value-based care use cases. Claims analytics companies have deep experience with claims data sources or rely heavily on claims-related data to fuel analytics and reporting. Applications from many of these vendors emphasize cost and utilization control and feature deeply descriptive insights into risks, costs, quality, and utilization. Providers have historically been reluctant to adopt these offerings, but that is changing.

Mainstream Analytics

This report characterizes current analytics solutions as either “mainstream” or “advanced.” Most HCOs have experience with mainstream analytics – often cloud-hosted and reliant on relational databases that store historical data from the EHR, claims, and other sources. The resulting applications characterize and summarize performance along multiple dimensions. While this technology approach is well-established, mainstream analytics still faces challenges. Chief among these are data quality and variability. Diligence is required on the part of vendors and HCOs to ensure this data is accurate, high-quality, and up-to-date.
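The descriptive summarization that mainstream analytics delivers can be illustrated with a minimal sketch. The data, column names, and cohort labels below are invented for illustration and do not reflect any vendor's schema:

```python
import pandas as pd

# Synthetic claims-like data; columns are illustrative assumptions only.
claims = pd.DataFrame({
    "cohort":     ["diabetes", "diabetes", "chf", "chf", "chf"],
    "member_id":  [1, 2, 3, 4, 5],
    "total_cost": [5200.0, 4800.0, 9100.0, 8700.0, 9500.0],
    "ed_visits":  [1, 0, 2, 1, 3],
})

# Summarize performance along the cohort dimension: average cost and utilization.
summary = claims.groupby("cohort").agg(
    avg_cost=("total_cost", "mean"),
    avg_ed_visits=("ed_visits", "mean"),
    members=("member_id", "count"),
)
print(summary)
```

In a production deployment, the same kind of aggregation would run against years of historical EHR and claims data in a relational store, which is why the data quality and variability issues noted above matter so much.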

Data complexity challenges are only increasing because new data sources are on the horizon. The All of Us program (formerly known as the Precision Medicine Initiative) promises to unleash a torrent of novel and voluminous data types. In addition, the vast trove of unstructured data in EHRs will soon contribute to a better understanding of patient cohorts and risks. Social determinants of health (SDoH), data from smart health monitoring and fitness devices, and a variety of patient-reported and publicly-available data sets are also beginning to be used in provider analytics.

Mainstream analytics has yet to supply a variety of predictive and prescriptive insights; for that, HCOs are looking at advanced analytics.

Advanced Analytics

Advanced analytics consists of interrelated technologies, the most common of which are artificial intelligence (AI)/machine learning (ML), natural language processing (NLP) and extraction, and big data technologies. These technologies and techniques are not widely deployed in healthcare, but are used to varying degrees by most of the vendors profiled in this report. The expectation is that as these technologies mature, advanced analytics will offer more and better predictive and prescriptive capabilities. Many vendors now offer optional services to help providers take better advantage of advanced analytics technologies. Increased organizational familiarity with AI technologies and algorithms should naturally increase user trust as the technologies mature and become more widespread.
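The predictive side of advanced analytics can be sketched with a simple risk model. The features, synthetic data, and choice of scikit-learn here are illustrative assumptions, not any profiled vendor's method:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic data: risk of an adverse event rises with age and prior admissions.
rng = np.random.default_rng(0)
n = 500
age = rng.integers(20, 90, n)
prior_admits = rng.poisson(1.0, n)
logit = 0.04 * (age - 55) + 0.8 * prior_admits - 1.0
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

# Fit a basic ML model and score a hypothetical high-risk patient.
X = np.column_stack([age, prior_admits])
model = LogisticRegression().fit(X, y)
risk = model.predict_proba([[78, 3]])[0, 1]  # predicted probability of the event
```

A model like this only becomes prescriptive when it is paired with workflow: someone has to decide what action the risk score should trigger, which is exactly the organizational familiarity gap described above.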

Conclusion

Many provider organizations, with experience gained from their VBC efforts, want more benefits from analytics. Whether it comes from their legacy point solutions and departmental reporting tools, mainstream analytics, or advanced analytics, provider organizations see analytics and reporting as a reliable way to pursue performance improvement goals across their enterprises.


Did You Know?

Health Catalyst: Good Vision but Short of Grand

Health Catalyst (HC) is arguably the strongest best-of-breed vendor for clinical data analytics in the market today. As a best-of-breed vendor, it has been able to stay one step ahead of its leading competitors, Epic and Cerner. But former Intel CEO Andy Grove’s maxim that “only the paranoid survive” certainly holds true for Health Catalyst, as Epic and, to a lesser extent, Cerner are investing heavily in their analytics solutions.

Brian Edwards and I had the pleasure of attending the recent HC Analytics Summit, which gave us an opportunity to take the pulse of HC, its clients, and its strategy going forward.

Key Takeaways:

  • Health Catalyst continues to grow at a healthy pace with clients rapidly migrating to their cloud-hosted solution. Today, nearly 95% of revenue is recurring.
  • Their vision forward is to become the “Systems Integrator (SI) of Data” – a ripe opportunity that few to date have capitalized on.
  • Not all is perfect. The company’s care management solution did not meet market expectations, and their efforts in Natural Language Processing (NLP) remain nascent.
  • The company has no intention of being purchased but plans to IPO, likely in 2019 if market conditions are favorable.

The Health Analytics Summit (HAS) brought together about 1,500 attendees – a geeky crowd of data scientists, analytics team leaders, and a smattering of executives. Sessions were by and large well attended, though the level of discussion in the few I sat in on was modest. Client testimonials were plentiful, most clearly showing significant savings. However, one has to question what the sunk costs on a given project were when the claimed savings were a paltry ~$65K.

Health Catalyst has grown quickly with over 700 employees currently. A couple of secrets to HC’s continuing success are:

Strong focus on their employees

Unlike most vendor organizations, which put their customers first, Health Catalyst goes to great lengths to ensure its employees are engaged – employees are the number one priority. Their belief: an engaged employee is a happy employee, and a happy employee will strive harder to make customers equally happy. I’m surprised more companies do not follow a similar strategy.

Ensuring client success

While all vendors want their customers to be successful, Health Catalyst takes it one step further by guaranteeing clients a CFO-verifiable 2:1 return on investment (ROI). One client I spoke with mentioned that some in her organization have pushed back on the costs of Health Catalyst. When I asked whether they are seeing that expected 2:1 ROI, she gave an emphatic yes – one that, she confirmed, hushes the critics.

A key message from CEO Dan Burton to attendees was the company’s desire to remain an independent, mission-focused company – “to unleash data as a catalyst of dramatic healthcare improvements.” To date, that message has resonated well across their growing provider installed base. But I wonder: Is it enough if the company plans to do an IPO?

The provider market is just getting started in understanding how to effectively use data and its insights to affect care delivery – no doubt there is still plenty of runway here. However, there are enormous opportunities outside the confines of this market. While I never wish to see HC lose sight of its mission, its long-term success, including a future IPO, will require a far grander vision that goes beyond the provider market to serve all stakeholders in the healthcare industry.

Unlocking Healthcare’s Big Data with NLP-powered Ambient and Augmented Intelligence

Key Takeaways

  • Natural Language Processing (NLP) is an increasingly low-cost, low-risk way for healthcare enterprises to experiment with machine learning and deep learning technologies.
  • HCOs can use ambient intelligence to unlock insights from the 80% of clinical data captured in an unstructured format.
  • Ambient voice technology has seen faster adoption than any other consumer technology before it, indicating potential for high rates of acceptance, utility, and efficacy in healthcare.

It wouldn’t be a radical statement to say that NLP bridges the human-computer divide more than most technologies. Yet ROI has been elusive, leaving prospective adopters reluctant to embrace it despite the numerous opportunities for NLP-driven solutions. NLP technologies have now reached an inflection point with the emergence of deep learning methods that are on par with humans for an ever-increasing list of core natural language skills, such as speech recognition and question answering. In our newest report, Natural Language Processing: Enabling the Potential of a Digital Healthcare Era, we profile 12 vendors, all with a track record in text mining and speech recognition: 3M, Artificial Intelligence in Medicine (Inspirata), Clinithink, Digital Reasoning Systems, Health Catalyst, Health Fidelity, IBM Watson Health, Linguamatics, M*Modal, Nuance, Optum, and SyTrue. Each has a reputation for delivering solutions that serve a particular set of use cases or customer groups, distinctions we capture using heat maps for each company.

NLP is particularly well suited to address two huge problems in healthcare – easing the clinical documentation burden for clinicians and unlocking insights from unstructured data in EHRs. Documentation consumes an ever-increasing portion of clinicians’ time. Recent research has shown physicians spend as much as half of their workday (6 hours of a 12-hour shift) in the EMR. Another recent study showed clinicians spend two hours on clinical documentation for each hour spent face-to-face with patients. Unsurprisingly, documentation burden is often cited as a key factor contributing to physician burnout. Ambient intelligence refers to passive digital environments that are sensitive to the presence of people, context-aware, and adaptive to the needs and routines of each end user. Virtual personal assistants (VPAs) such as Amazon’s Alexa and Google’s Assistant are familiar examples.

Speech recognition technology is approaching 99-percent accuracy, a milestone that some argue means that voice will become the primary way we interface with technology. I am skeptical of this prediction, at least when it comes to the broader utility of voice-based interfaces for consumers. The visual display, with its links and rich media, is an indispensable element of the modern digital experience.

Smart speakers, the input device for speech recognition, are the hottest technology trend of the moment, with an adoption curve that exceeds even the smartphone (see graphic below from Activate). We expect the smart speaker to rapidly become a fixture in both the home and office setting, following a similar path to maturity as the smartphone, offering applications for consumers and enterprises.

Interest and adoption in healthcare is already apparent. In September Nuance announced a smart speaker virtual assistant that uses conversational cloud-based AI (Microsoft Azure) to engage physicians during clinical documentation. In late November a post on the Google Research Blog described internal research and a pilot at Stanford investigating the potential to use a similar smart speaker interface and Automatic Speech Recognition (ASR) technology to create a virtual scribe.

Startups are taking on this problem too. Saykara, led by former executives at Nuance and Amazon, is developing a virtual assistant similar to Google’s. The company claims its speech recognition technology is far more advanced than that of its heavyweight competitors. Others are developing ambient scribes to passively document patient encounters, including Suki.ai, Robin Healthcare, and Notable Health.

EHR vendors are also making investments in ambient intelligence. Epic has partnered with Nuance and M*Modal to embed their ambient scribe technology directly into clinical workflows. Allscripts and athenahealth have partnered with startup NoteSwift. eClinicalWorks has launched a virtual assistant called Eva, which is initially intended to respond to queries for things like recent lab data or past clinical note content.

Barriers remain on the road to ubiquitous adoption of NLP technology by healthcare enterprises. NLP gives HCOs a low-risk opportunity to experiment with advanced machine learning and deep learning technologies, but it is not the type of technology that just any analyst in the IT department can implement optimally; it requires specialized expertise that is in short supply. While free text and mouse clicks will dominate the clinical documentation landscape in the near term, healthcare enterprises will soon expect their users to talk to their applications.

Give Us Your Data – Is it Really That Easy?

HIMSS18 Review 1 of 4

By Ken Kleinberg, Brian Murphy, and Brian Edwards

What’s inside the black box of algorithms?

During Eric Schmidt’s opening keynote at HIMSS18, he asserted that, given the state of algorithms today, it’s possible to take any large data set and make strong predictions – and healthcare is no exception. There’s no need for clinical content knowledge, rules, or past experience. His statements were met with plenty of skepticism – less about the capabilities of Alphabet and its algorithms, and more about the realities of gaining access to the right healthcare data sets. This is not trivial.

So who should get data from whom? What about patient consent? Who can be trusted? Historically, health systems have done their own analytics and research within the boundaries of their own systems. Vendor analytic solutions were implemented on site. Even this limited scenario presented complex challenges – in particular, just bringing data together to a point where it could be analyzed. Transferability of models was difficult, and costs were not shared. The use of analytics was therefore sparse, mostly limited to research and quality improvement.


Slowly but surely, the situation is changing. The cloud is becoming much more accepted, with many options possible (private, public, hybrid). Algorithms, enabled by rapidly advancing hardware/computing power, are capable of dealing with much larger and more complex data sets. Data operating system approaches can stream data in a liquid fashion from multiple locations/sources, reducing the need for centralized repositories.

A next step, as data becomes more available, is to fully utilize it. Advances in natural language processing (NLP) make it possible to extract and mine useful features from unstructured data such as text, faxes, and reports. Algorithms can increasingly use incomplete, messy, or ill-defined data and “fill in the blanks.” At a certain scale, data quality becomes less of a factor in conducting analytics.
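A toy sketch gives the flavor of extracting structured features from free text. The note text and the regex pattern are invented for illustration; production clinical NLP relies on trained models and curated medical vocabularies, not a single pattern:

```python
import re

# Hypothetical clinical-note snippet (invented, not real patient data).
note = "Patient started on metformin 500 mg BID; continue lisinopril 10 mg daily."

# Naive pattern: a word followed by a numeric dose in mg.
pattern = re.compile(r"([a-z]+)\s+(\d+)\s*mg", re.IGNORECASE)

# Extract (drug, dose_mg) features from the unstructured text.
meds = [(drug.lower(), int(dose)) for drug, dose in pattern.findall(note)]
print(meds)  # [('metformin', 500), ('lisinopril', 10)]
```

The gap between this sketch and real systems (negation handling, abbreviations, misspellings, context) is precisely why the specialized expertise discussed later remains in short supply.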


Whose Black Box?

There is also a lot of healthy discussion about how AI systems make decisions. The primary concerns are black-box algorithms and a lack of data transparency. This even reached a point where a major educational institution recently recommended that governments not rely on any AI or algorithmic systems for “high stakes” domains, such as healthcare technology, where the way a system makes a decision cannot be understood in terms of due process, auditing, testing, and accountability standards.

Despite the black-box nature of these AI systems, the fact is that they can still be validated using objective methods, such as how and what they were trained upon, and how they perform in real-world clinical scenarios vs. human performance. As long as they are not used in closed-loop systems, and as long as there is a human expert able to accept or dismiss their recommendations, they provide valuable input (before or after) for difficult cases (such as whether surgery or therapy is the best course of action). They may also serve as a last resort with the proper consent (as with a terminal cancer patient).

So that all is highly promising – but is healthcare ready to hand this data over?

Flat FHIR and Analytics

Percolating below the surface at HIMSS was disquiet about the “bulk data transfer” proposal. This proposed method would make large sets of data (think cohort-level data) more freely available for analytics purposes. It would allow a user or program in one organization to issue a broadcast query to the country at large and receive patient data from other organizations. For example, an ACO quality manager could issue a query to a community and get all of the relevant data for patients in the ACO.

This proposal, also known by the name “Flat FHIR,” is part of the TEFCA discussion (Trusted Exchange Framework and Common Agreement) insofar as such queries are a contemplated use case. But among people who are paying attention to this proposal, there are unanswered questions.
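The draft “Flat FHIR” specification frames such a query as an asynchronous `$export` operation on a FHIR server. The sketch below shows the shape of the kick-off call as described in the proposal; the base URL and group ID are hypothetical, and a real server would answer 202 Accepted with a Content-Location URL to poll for the finished export:

```python
# Sketch of a FHIR bulk-export kick-off request per the draft proposal.
BASE = "https://fhir.example-aco.org"  # hypothetical endpoint

def kickoff_export(base: str, group_id: str):
    """Build the async $export request for one patient group (e.g., an ACO cohort)."""
    url = f"{base}/Group/{group_id}/$export"
    headers = {
        "Accept": "application/fhir+json",
        "Prefer": "respond-async",  # the export runs asynchronously on the server
    }
    return url, headers

url, headers = kickoff_export(BASE, "aco-panel-1")
# An HTTP GET against `url` with `headers` would start the export; the client
# then polls the Content-Location returned by the server until files are ready.
```

Even this simple request shape raises the capacity and consent questions listed below: a single broadcast `$export` can obligate a receiving organization to assemble and serve a very large data set.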

Open-ended queries arriving from anywhere in the healthcare system are not currently part of most HCOs’ IT capacity plans:

  • Will organizations end up having to add more compute and network resources to satisfy such queries?
  • Does TEFCA’s requirement to provide non-discriminatory access mean that organizations will not be able to implement reasonable network traffic and quality-of-service controls?
  • If patients have different consent profiles in different organizations, how should a query recipient satisfy the request?
  • Will organizations have to establish revenue share agreements based on pro-rata data contributions?
  • Will the fact that TEFCA puts the onus on the query receiver to reconcile medications, allergies, and problem lists mean that the receiver must verify that its data is current with proximate organizations before satisfying the original query?

Such questions represent the tip of the iceberg. As a practical matter, before the bulk data transfer proposal can ever be a day-to-day reality, many technical, non-technical, and financial questions must be resolved.

Despite these questions, bulk data access will be critical for the industry to move beyond the current artisanal methods of building and maintaining data stores for analytics purposes. After all, analytics has more to offer than the dashboards and reports that describe the recent past. HIMSS18 was less a venue to air out the challenges associated with making bulk data transfer a reality than it was an opportunity to preview some of the advanced and predictive analytics use cases it could enable.
