Signaling the Tipping Point of AI Value in Healthcare
Top minds across a broad spectrum of healthcare disciplines converged in Boston April 23-25 for the 2018 World Medical Innovation Forum (WMIF), which drew 1,700 registrants and 169 speakers to discuss artificial intelligence. Researchers, clinicians and industry leaders took the stage for a collection of panels and “1:1 fireside chats” on healthcare applications of AI and related technologies: their progress, roadblocks and the promise of future value. The majority of provider participants were from Harvard teaching hospitals, as Partners Healthcare runs the conference.
The healthcare innovation community is always looking for applications to illustrate the potential of AI, but also to grasp something tangible in the near term rather than solely a futuristic vision. At this point, there are many diverse applications for AI and related technologies, as highlighted in the framing of the conference, which opened with a discussion of 19 “AI breakthroughs” and closed with the “disruptive dozen” (scroll to the end of the post for the listing). It was encouraging that these innovative pursuits were not limited to any small facet of care or research, but instead cover a broad spectrum of healthcare challenges.
Across the rest of the conference were overlapping discussions of the impediments and implications of these many applications and advances in AI. These AI-centric discussions can be organized into five themes that broadly mirror key concerns of the healthcare industry as a whole, indicative of both the hype and the potential for these technologies to fundamentally change our methods of care delivery:
Unsurprisingly, the importance of good sources of high-quality data was a core theme across all panels. In some cases, the data already exists but has not yet been analyzed; drug manufacturers sitting on mountains of clinical trial data are a prime example. New data sources, like those coming from companies like Flatiron – recently acquired by Roche – provide a fresh resource for research. All of this data, however, exacerbates the challenges of large-scale data analysis. To run AI and machine learning systems, the data needs to be clean, and surveys have indicated that data scientists spend 60% of their time cleaning data. The challenge before everyone includes sourcing data (as clean as possible), then cleaning it and making it usable for these tools.
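To make the cleaning burden concrete, here is a minimal sketch of the kind of normalization that precedes any modeling. The field names and rules are hypothetical, invented for illustration rather than drawn from any specific dataset:

```python
from typing import Optional

def clean_record(record: dict) -> Optional[dict]:
    """Normalize one raw clinical-trial record; return None if unusable."""
    cleaned = {}

    # Drop records that are missing a patient identifier entirely.
    patient_id = (record.get("patient_id") or "").strip()
    if not patient_id:
        return None
    cleaned["patient_id"] = patient_id

    # Map free-text sex values onto a small controlled vocabulary.
    sex = (record.get("sex") or "").strip().lower()
    cleaned["sex"] = {"m": "male", "male": "male",
                      "f": "female", "female": "female"}.get(sex, "unknown")

    # Coerce lab values to floats, marking unparseable entries as missing.
    try:
        cleaned["hba1c"] = float(record.get("hba1c", ""))
    except ValueError:
        cleaned["hba1c"] = None

    return cleaned

raw = [
    {"patient_id": " 001 ", "sex": "M", "hba1c": "6.1"},
    {"patient_id": "", "sex": "F", "hba1c": "7.0"},       # no ID: dropped
    {"patient_id": "003", "sex": "Female", "hba1c": "n/a"},
]
usable = [r for r in (clean_record(x) for x in raw) if r is not None]
```

Even this toy version shows why the work is slow: every field needs its own rules for missing values, free-text variants and unparseable entries, and those rules differ for every data source.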
In many cases, the limiting reagent to progress is our understanding of biology – as much as or even more so than the technology. Technology, in fact, is the means to understand the biology further. Life sciences companies like Novartis, Pfizer and others are actively using, in their words, “higher order data,” including ‘omic data sources. In doing so, they hope to reach breakthroughs faster by redefining how they consider new drug discovery.
William Lane, MD, PhD delivered one of the 19 breakthroughs that opened the conference, bloodTyper, which offers a new outlook on the long-standing method of categorizing patient blood types based on the presence or absence of two antibodies and two antigens. bloodTyper uses DNA-based categorization as opposed to serologic testing alone. Whole genome sequencing costs have declined dramatically in recent years and are expected to continue to decline, making this solution viable for widespread clinical adoption. Technology enables this effort, but much research needs to be done to illustrate specific value and bring these new methods into practice. With continued advancement of technological capability, understanding the biology of health and disease will continue to be a primary obstacle.
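The underlying idea of genotype-to-phenotype blood typing can be illustrated with a toy example (this is a simplification for intuition only, not bloodTyper's actual algorithm, which works from whole genome sequence). For the ABO system, the A and B alleles are codominant and O is recessive:

```python
def abo_type(allele1: str, allele2: str) -> str:
    """Toy ABO phenotype from a pair of inherited alleles.

    A and B are codominant; O is recessive. Real DNA-based typing
    (as in bloodTyper) infers alleles from sequence data and covers
    many more antigen systems than this sketch.
    """
    alleles = {allele1.upper(), allele2.upper()}
    if alleles == {"A", "B"}:
        return "AB"
    if "A" in alleles:
        return "A"
    if "B" in alleles:
        return "B"
    return "O"

# A patient inheriting an A allele and an O allele types as group A.
patient_type = abo_type("A", "O")
```

The hard part, as the panel noted, is not this final mapping but establishing which genomic variants reliably predict antigen expression across diverse populations.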
One of the consistent themes across panel discussions was the notion that medical technology, AI in particular, is futuristic, while our care system is stone-aged. The challenge of adoption and change to the care paradigm is not a limitation of technology. Panelists repeatedly remarked about how the pairing of “Jetsons technology and a Flintstones care system” would take substantial time to evolve, because change requires evidence and trust that are not ascertained lightly.
In some cases, however, evidence-based change is already upon us. A group at MGH, represented at WMIF by Erica Shenoy, MD, PhD, is using machine learning to more quickly identify cases of hospital-acquired infection, flagging C. diff cases full days before the currently accepted five-day standard process. We expect to see substantial growth in research publication volume illustrating the value of AI and machine learning technologies in areas like this.
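The general shape of such an early-warning system can be sketched as a daily risk score over EHR-derived features. The features, weights and threshold below are invented for illustration; the MGH group's actual model is not public in this post:

```python
import math

# Hypothetical weights for illustration only -- not the MGH model.
WEIGHTS = {
    "recent_antibiotics": 1.2,   # antibiotic exposure in past 30 days
    "days_hospitalized": 0.15,   # current length of stay
    "prior_cdiff": 2.0,          # history of C. diff infection
    "age_over_65": 0.8,
}
BIAS = -3.0

def infection_risk(features: dict) -> float:
    """Logistic risk score in [0, 1] from daily EHR-derived features."""
    z = BIAS + sum(WEIGHTS[k] * float(features.get(k, 0)) for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

def flag_for_review(features: dict, threshold: float = 0.5) -> bool:
    """Surface high-risk patients to infection control days earlier."""
    return infection_risk(features) >= threshold

high_risk = {"recent_antibiotics": 1, "days_hospitalized": 10,
             "prior_cdiff": 1, "age_over_65": 1}
```

Recomputed daily against the EHR, a score like this can surface candidates well before a culture-based protocol completes, which is the value the MGH work demonstrates.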
On the other side of the curtain, AI is changing the way that industry operates. Drug development, in particular, is undergoing a revolution of sorts. Companies like Exscientia, with diverse data science capabilities, provide life sciences partners with a way to look across different data types including ‘omic data, research text and other sources. This allows manufacturers to potentially repurpose molecules and sub-organize disease for more precise targeting.
Another recurring theme was the impact of AI on the workforce. As in other healthcare sub-verticals, there is a large and increasing demand for data experts. Radiologists and pathologists across the conference echoed the surprisingly optimistic conclusions of a panel dedicated to this topic, both with respect to the increased demand for data science professionals in all organizations and the potential obviation of some roles.
The future will likely position these data scientists alongside healthcare professionals as part of multidisciplinary/cross-functional teams in care and non-care settings alike. There is also an expectation that data science literacy, at least at a high level, will become a core component of education for many healthcare professionals, not just IT specialists.
The fear of the obviation of certain roles was interestingly framed. At a very high level, the introduction of AI and related technologies will exacerbate the division between the highly educated and the less educated, as Glenn Cohen of Harvard Law School eloquently pointed out. It was also highlighted, however, that many lower-skilled manual roles have already been dramatically reduced with the introduction of EMR systems. The natural targets often discussed, beyond the lesser-skilled workforce, are experts in the clinical disciplines of radiology and pathology. In both contexts, AI technologies have been shown to approximate or even beat the accuracy of some clinical practice by human experts. There is also a global shortage of these types of experts. The prevailing opinion expressed was that new technologies will be used to make these professionals more efficient and effective, and it is unlikely that technology will be used to truly replace the human element.
The burden of AI technologies on regulators is substantial. They need to be able to evaluate, audit and assure quality of these new technical capabilities. Linguamatics is one example of the many companies that are making progress in crafting technology solutions for healthcare while also targeting the FDA as a customer. Providing a mechanism for auditability of otherwise “black box” AI systems is a great benefit to the regulators.
Standardization is key to enabling scalability and supporting industry-wide progress. Consider DICOM, for example, which offers a standard for many imaging modalities. The expectation has been set that at least some standardization must be a focus of innovators, so as not to run into an even greater interoperability challenge with more complex technology. Building gold-standard training data sets was one highlighted solution that begins to address this challenge for AI.
Precision medicine was an end target for many of the applications and topics of discussion at the conference. The development of a clearer, richer phenotype and genotype (or other “’omic” and new forms of data) essentially yields the potential for a digital twin for each individual patient. To accompany this, the development of precision drugs, diagnostic capabilities and other therapies is the end promise of many of the applications of AI, machine learning and other technologies.
We will only be able to deliver on these promises if our data, technology and systems for the delivery of care are able to adapt. To get there, though, we need to establish trust and confidence in AI and its elevated role across the healthcare ecosystem and among patients.
Currently, there are little islands of the right components to make great progress, and work is well underway. In oncology, for example, there is data access, a willingness and need from physicians and patients, a motivated industry (opportunity for profit or strategic position), research capacity and funding, and payment and regulatory feasibility. At WMIF, Atul Gawande described these environments as “ready environments,” in that they are capable and motivated. The historical example he gave to juxtapose “ready” and “not ready” was the spread of anesthetized surgery over weeks or months, vs. antisepsis, which he estimated took 20 years to spread.
Not all areas of healthcare are ready environments, but the ubiquity of efforts to utilize AI technologies to accelerate processes, improve accuracy, increase access, increase bandwidth and offer precise care points to a tipping point. Within the next year, I expect to see a growing volume of applications and, more importantly, more examples of impact.
One More Step in the Long Road of Precision Medicine
For any new therapy, diagnostic or device brought forth by our healthcare innovation community, there are three high-level barriers generally encountered on the path to commercialization: Regulatory approval, payment confirmation (generally coverage by public and/or private healthcare payers) and adoption by healthcare providers. For new classes of therapy, such as genetically targeted therapies and their companion diagnostics, there is often a greater challenge to pass regulation, assure coverage and gain adoption since there is little precedent.
As of mid-March, there is new precedent to leverage for gene-based diagnostics and all stakeholders in the development of genomics applications in medicine. Following the November 2017 approval by the FDA of Foundation Medicine's FoundationOne CDx (F1CDx™), the Centers for Medicare & Medicaid Services (CMS) proposed a National Coverage Determination (NCD) for diagnostic lab tests that include Next Generation Sequencing (NGS). These first steps were the culmination of a great deal of work by industry players, researchers and regulators. On March 16, 2018, CMS announced a finalized NCD covering NGS for Medicare patients with advanced cancer (including Stage III, Stage IV, recurrent, relapsed, refractory or metastatic cancers). These are diagnostic tests that, as companions to other diagnostics, identify treatment options based on certain genetic mutations.
The burden of payment for genetic sequencing was a topic of discussion at HIMSS18 among players in the space of gene-based therapy (HIT, providers, etc.). Prior to the CMS coverage decision, patients often had only the option to pay out of pocket for genetic sequencing. Based on this NCD, Medicare patients with advanced cancer have coverage. That coverage will be limited to FDA-approved diagnostics, such as F1CDx™, but the test results may be used both to match patients with FDA-approved gene-based therapies and to identify patient candidacy for clinical trials of therapies not yet approved by the FDA. This potentially charts a clearer, more predictable path for additional NGS diagnostics in development, not only because of payment and regulatory precedent, but importantly because of the potential to speed up clinical trials for gene-based therapies if candidates are identified more quickly.
Patients diagnosed with cancer, or really any life-threatening condition, want and deserve access to the latest proven advancements in medicine. This NCD marks a big step for patient access and for the development of targeted therapies and companion diagnostics. It also brings stakeholders' attention to the looming challenge of payment at a systemic level. This remains a primary focus of the discussion among payers and policymakers.
CMS Administrator Seema Verma and other high-ranking Government officials have discussed their intentions to curb costs for Medicare and Medicaid specifically related to novel genetically targeted therapies because they come at notably high cost. Therapies of this type can be priced between $300,000 and $500,000, with some reaching as high as $1 million. CMS does not negotiate prices, so its efforts to reduce the cost burden are focused on alterations to the format of payment for state agencies and managed care organizations who do. Some concepts floated by officials include paying less for a given drug based on the target indication used with a patient, or paying for high-cost drugs over a longer period of time. The CMS final NCD for genetic sequencing diagnostics only further brings this cost challenge to the forefront.
As policymakers and payers take on the burden of cost coverage, the progression of the healthcare sub-industries focused on leveraging patients' genetic and other “-omic” data will benefit from this step toward better coverage. However slow and bumpy the progress may seem, expect to see continued or accelerated investment in diagnostics and therapy from both public research sources and private equity.
As these areas of investment continue, HIT vendors will have an opportunity to differentiate. Cancer in particular offers a somewhat carved-out business channel for vendors to target with specialized solutions, and a market big enough to warrant the investment. Cancer patients often have large care teams, greater needs to contact the care team or show up for therapy, and a lot of test results to manage. EHR systems, telehealth companies, care management, risk-based business models and other subsets of HIT all have an opportunity for differentiation within this specialized care community.
Vendors such as Flatiron, recently acquired by Roche for $1.9 billion, Syapse, 2bPrecise, Orion Health and others have taken early, focused steps both with respect to “precision medicine” and to advancements in oncology care (to which the CMS NCD specifically pertains). With this NCD, healthcare IT vendors have yet another signal to consider the role of genomic and other comparably complex data types in their systems.
Here are a few specific applications to keep an eye on related to this evolution:
As NGS data becomes more readily available and expected as a component of care, analysis and facilitating utility of these complex forms of data will be an opportunity for competitive advantage.
Top 7 Things to Look for at HIMSS17
The final countdown has begun. In a few short days I and the rest of the Chilmark Research team will make our annual pilgrimage to the big health IT confab, HIMSS17, to rub shoulders with some 45,000 of our closest friends.
I have a love-hate relationship with this event. I love the opportunity to meet with many leading advocates, innovators, developers, and users of IT who are all truly trying to improve the delivery of care – to improve the patient experience. This is my/our tribe. These individuals and even organizations are who we as analysts seek out, looking to have an in-depth conversation. These conversations are enlightening, help us further refine our research agenda, and provide us the opportunity to accurately report on exactly what is working and, frankly, what may still be more vaporware than software.
HIMSS is not all sweet-smelling roses. What frustrates me the most is the hype. Now I know that vendors pay a pretty penny to exhibit their wares at HIMSS and, having been on the other side of the fence, I know intimately the challenges of trying to differentiate yourself from all the others that surround you. But what I truly hate is when the latest fad or buzzword enters the hype-cycle and every single vendor claims to have that solution, to address that buzzword du jour.
Years ago I remember walking by booth after booth of vendors claiming they had a Health Information Exchange (HIE) solution. When I saw such claims proudly displayed in the Dell booth, I knew it was all BS and decided to separate the wheat from the chaff with our first report on the HIE market. That report was a huge success and really put wind in the sails of what was then a very small company. In recent years it has been population health management (PHM), a misnomer if there ever was one. Until the last 12-18 months, not a single vendor was capable of fully supporting an organization's PHM strategy. There are simply too many moving parts to PHM. Simplistic care gap analysis with robo-emails and calls is not PHM. The sad thing is, these buzzwords get so over-used, so misconstrued, and so abused that they become meaningless – I'm coming close to detesting the term PHM.
Enough of a preamble. The following is what I will be on the lookout for at HIMSS – and you should, too.
1. Artificial intelligence (AI) is big, everywhere, but are we truly seeing traction? This term, along with machine learning and cognitive computing, will be on prominent display at HIMSS17. I would not be surprised if this is the top buzzword at HIMSS this year, but what I really want to know is how AI is actually being used. What are the use cases? Where will it scale first (e.g., consumer, clinician, back-office, radiology…)? What impact will it have on existing processes and workflows? How will it impact staffing levels?
2. Precision medicine: Is it real or still theoretical? There’s been a lot of hype on precision medicine the last couple of years but little action. Will be interested to learn how much traction some of the early innovators are getting (NantHealth) and what new entrants (2bPrecise) are planning. Tremendous amount of opportunity here that overlaps some of the current work being done in AI – how will these two come together in the future to improve and optimize the delivery of care for the individual?
3. How will care management and patient engagement merge – who is thinking beyond silos here? A core tenet to effective care management is patient activation and self-care, yet care management and patient engagement solutions remain by and large completely divorced from one another. Who among vendors, providers, and consultants is truly thinking beyond silos and looking to effectively meld care management and patient engagement?
4. Who is using IoT and PGHD at scale? Despite all the talk of health Internet of Things (IoT) and patient-generated health data (PGHD), our research to date has yet to find an organization that is using IoT and PGHD at scale for remote care monitoring. Sure, there are plenty of pilots, but as one healthcare executive said to me: It's one thing to do this for 100 patients, quite another for 10,000 – we've yet to figure out how to scale the workflow across the enterprise and community.
5. Outcome measures, anyone? The focus of quality initiatives (and measures) for years has been on process measures. We are getting to a level of maturity in the industry’s use of IT that we need to be thinking beyond process measures to outcome measures. In a recent briefing, a leading HIT vendor proudly showed how its clients excel at meeting process measures. When asked how these same clients do on outcome measures – well, that was a question the company had no answer for. It’s time we start figuring this out. I want to know who is doing this on a routine basis and for what use cases.
6. How are vendors looking to support PHM and VBC? Core to our research in 2017 is gaining a better understanding of provider-payer convergence. Our thesis is that providers have a core competency in population health management (PHM), whereas payers have competency in value-based care (VBC). As these two entities increasingly collaborate to improve access and care delivery within a community, how will vendors provide a platform to support such requirements?
7. What's next? a.k.a., reading the Trump administration tea leaves and what may unfold across the healthcare sector over the next few years. The new administration has made a lot of pronouncements pertaining to healthcare, from pharmaceutical pricing to repealing the Affordable Care Act. We still do not know how all this will play out, and that lack of clarity is likely to impact budgets. I want to know: How big is that impact, and how might it affect IT spend going forward? While I can already imagine every vendor telling me that it has a great pipeline, a great backlog, etc., I remain unconvinced. I'll be digging deep on this one.
Bridging Genomics-Health IT Gap for Precision Medicine
Since the White House launched its Precision Medicine Initiative in January 2015, there has been a great deal of buzz about personalized or precision medicine and the future of healthcare. “Personalized medicine” is the older term and is gradually falling by the wayside, as critics think it denotes a focus on the individual, whereas precision medicine focuses on which treatments work best for patients in a specific genetic, lifestyle or environmental context. The latter is more appropriate in the context of digital health, where the growth of wearables, mHealth and even population health management has become part of precision medicine initiatives.
But once we settle upon a definition, what are the real challenges to making precision medicine a reality? Given the significant challenges associated with EHR implementations as well as interoperability challenges across a given community, how will healthcare begin to address an additional stakeholder whose data sets are much larger and bring these insights into the clinic in an actionable way?
There are very few, if any, EHR examples today with the capability of systematically integrating genetic data in a format that can be readily used for treatment and therapeutic practice. This raises a number of important questions: When can precision medicine become a reality in the clinic, and what kind of strategic road map can be put in place to address the health IT issues?