Signaling the Tipping Point of AI Value in Healthcare
Top minds across a broad spectrum of healthcare disciplines converged to discuss artificial intelligence in Boston April 23-25 at the 2018 World Medical Innovation Forum (WMIF), which drew 1,700 registrants and 169 speakers. Researchers, clinicians and industry leadership paraded on stage for a collection of panels and “1:1 fireside chats” to discuss healthcare applications of AI and related technologies, their progress, roadblocks and the promise of future value. The majority of provider participants were from Harvard teaching hospitals, as Partners HealthCare runs the conference.
The healthcare innovation community is always looking for applications that illustrate the potential of AI, but also for something tangible that is closer to the present than a purely futuristic vision. At this point, there are many diverse applications for AI and related technologies, as highlighted in the framing of the conference, which opened with a discussion of 19 “AI breakthroughs” and closed with the “disruptive dozen” (scroll to the end of the post for the listing). It was encouraging that these innovative pursuits were not limited to any small facet of care or research, but instead covered a broad spectrum of healthcare challenges.
Across the rest of the conference were overlapping discussions of the impediments and implications of advances in AI and the many applications. These AI-centric discussions can be organized into 5 themes that broadly mirror key concerns of the healthcare industry as a whole, indicative of both the hype and potential for these technologies to fundamentally change our methods for care delivery:
Unsurprisingly, the importance of good sources of high-quality data was a core theme across all panels. In some cases, the data already exists but has not yet been analyzed; drug manufacturers sitting on mountains of clinical trial data are a prime example. New data sources, like those coming from companies such as Flatiron – recently acquired by Roche – provide a fresh resource for research. All of this data, however, exacerbates the challenges of large-scale data analysis. In order to run AI and machine learning systems, the data needs to be clean, and surveys have indicated that data scientists spend 60% of their time cleaning data. The challenge before everyone is sourcing data (as clean as possible), then cleaning it and making it usable for these tools.
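To make concrete the kind of cleaning work that consumes so much of a data scientist's time, here is a minimal sketch in Python. The field names, unit conventions and records are hypothetical, invented purely for illustration; they are not drawn from Flatiron's or anyone else's actual datasets.

```python
# Minimal sketch of cleaning raw clinical records before analysis.
# All field names and unit conventions here are hypothetical.

def clean_records(raw_records):
    """Normalize units, drop unusable rows, and de-duplicate by patient ID."""
    seen = set()
    cleaned = []
    for rec in raw_records:
        pid = rec.get("patient_id")
        weight = rec.get("weight")
        unit = rec.get("weight_unit", "kg")
        # Drop rows missing the fields the analysis needs.
        if pid is None or weight is None:
            continue
        # Normalize pounds to kilograms so downstream models see one unit.
        if unit == "lb":
            weight = round(weight * 0.453592, 1)
        # Keep only the first record per patient (simple de-duplication).
        if pid in seen:
            continue
        seen.add(pid)
        cleaned.append({"patient_id": pid, "weight_kg": weight})
    return cleaned

raw = [
    {"patient_id": "p1", "weight": 154, "weight_unit": "lb"},
    {"patient_id": "p2", "weight": 80, "weight_unit": "kg"},
    {"patient_id": "p2", "weight": 81, "weight_unit": "kg"},  # duplicate
    {"patient_id": "p3", "weight": None},                      # unusable row
]
print(clean_records(raw))
```

Even this toy version touches all three steps named above – sourcing, cleaning and making data usable – and real clinical pipelines multiply each step across hundreds of fields and formats.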
In many cases, the limiting reagent to progress is our understanding of biology – as much as or even more so than the technology. Technology, in fact, is the means to understand the biology further. Life sciences companies like Novartis, Pfizer and others are actively using, in their words, “higher order data,” including ‘omic data sources. In doing so, they hope to reach breakthroughs faster by redefining how they consider new drug discovery.
William Lane, MD, PhD delivered one of the 19 breakthroughs that opened the conference: bloodTyper, which offers a new outlook on the long-standing method of categorizing patient blood types based on the presence or absence of antigens and antibodies. bloodTyper uses DNA-based categorization as opposed to serologic testing alone. Whole genome sequencing costs have declined dramatically in recent years and are expected to continue to decline, making this solution viable for widespread clinical adoption. Technology enables this effort, but much research needs to be done to demonstrate specific value and bring these new methods into practice. Even with continued advancement of technological capability, understanding the biology of health and disease will remain a primary obstacle.
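The genetic logic that makes DNA-based typing possible can be sketched very simply for the ABO system: the A and B alleles are co-dominant while O is recessive, so a genotype's phenotype follows from which alleles are present. The toy Python function below illustrates only that textbook relationship; bloodTyper's actual pipeline, which calls variants from whole genome sequence across many blood group systems, is far more involved and is not shown here.

```python
# Toy illustration of genotype-to-phenotype logic for the ABO blood group.
# A and B alleles are co-dominant; O is recessive.

def abo_phenotype(allele1, allele2):
    """Return the ABO phenotype implied by a two-allele genotype."""
    alleles = {allele1, allele2}
    if alleles == {"A", "B"}:
        return "AB"    # co-dominant A and B are both expressed
    if "A" in alleles:
        return "A"     # AA or AO
    if "B" in alleles:
        return "B"     # BB or BO
    return "O"         # OO: no A or B antigen expressed

print(abo_phenotype("A", "O"))  # → A
print(abo_phenotype("O", "O"))  # → O
```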
One of the consistent themes across panel discussions was the notion that medical technology, AI in particular, is futuristic, while our care system is stone-aged. The challenge of adoption and change to the care paradigm is not a limitation of technology. Panelists repeatedly remarked that the pairing of “Jetsons technology and a Flintstones care system” would take substantial time to evolve, because change requires evidence and trust that are not earned lightly.
In some cases, however, evidence-based change is already upon us. A group at MGH, represented at WMIF by Erica Shenoy, MD, PhD, is using machine learning to more quickly identify cases of hospital-acquired infection, flagging C. diff cases full days ahead of the currently accepted 5-day standard process. We expect to see substantial growth in research publication volume illustrating the value of AI and machine learning technologies in areas like this.
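Work in this vein typically scores structured EHR signals daily to flag at-risk patients before conventional surveillance would catch them. The sketch below is purely illustrative: a hand-weighted risk score over hypothetical features, with made-up weights and threshold, bearing no relation to the actual MGH model.

```python
# Illustrative (not real) risk score for early infection flagging.
# Feature names, weights, and threshold are hypothetical, chosen only to
# show the pattern of turning structured EHR data into a daily alert.

WEIGHTS = {
    "recent_antibiotics": 2.0,     # prior antibiotic exposure (0 or 1)
    "days_hospitalized": 0.3,      # length of stay so far, in days
    "age_over_65": 1.0,            # 0 or 1
    "proton_pump_inhibitor": 0.5,  # 0 or 1
}
THRESHOLD = 3.0  # alert cutoff, hypothetical

def risk_score(patient):
    """Weighted sum of the patient's feature values (missing features = 0)."""
    return sum(WEIGHTS[k] * patient.get(k, 0) for k in WEIGHTS)

def flag_for_review(patient):
    """Return True if the patient's score crosses the alert threshold."""
    return risk_score(patient) >= THRESHOLD

patient = {"recent_antibiotics": 1, "days_hospitalized": 4, "age_over_65": 1}
print(risk_score(patient), flag_for_review(patient))
```

A production system would learn such weights from labeled historical cases rather than hand-tune them, but the alerting pattern – score daily, flag above a threshold, route to infection control – is the same.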
On the other side of the curtain, AI is changing the way that industry operates. Drug development, in particular, is undergoing a revolution of sorts. Companies like Exscientia, with diverse data science capabilities, provide life sciences partners with a way to look across different data types including ‘omic data, research text and other sources. This allows manufacturers to potentially repurpose molecules and sub-organize disease for more precise targeting.
Another recurring theme was the impact of AI on the workforce. As in other healthcare sub-verticals, there is a large and increasing demand for data experts. Radiologists and pathologists across the conference echoed the surprisingly optimistic conclusions of a panel dedicated to this topic, both with respect to the increased demand for data science professionals in all organizations and the potential obviation of some roles.
The future will likely position these data scientists alongside healthcare professionals as part of multidisciplinary/cross-functional teams in care and non-care settings alike. There is also an expectation that data science literacy, at least at a high level, will become a core component of education for many healthcare professionals, not just IT specialists.
The fear of the obviation of certain roles was interestingly framed. At a very high level, the introduction of AI and related technologies will exacerbate the division between the highly educated and the less educated, as Glenn Cohen of Harvard Law School eloquently pointed out. It was also highlighted, however, that many of the lower-skilled manual roles have already been dramatically reduced with the introduction of EMR systems. Beyond the lesser-skilled workforce, the targets most often discussed are experts in the clinical disciplines of radiology and pathology. In both contexts, AI technologies have been shown to approximate or even beat the accuracy of some clinical practice by human experts, and there is a global shortage of these types of experts. The prevailing opinion expressed was that new technologies will be used to make these professionals more efficient and effective, and that it is unlikely technology will truly replace the human element.
The burden of AI technologies on regulators is substantial. They need to be able to evaluate, audit and assure quality of these new technical capabilities. Linguamatics is one example of the many companies that are making progress in crafting technology solutions for healthcare while also targeting the FDA as a customer. Providing a mechanism for auditability of otherwise “black box” AI systems is a great benefit to the regulators.
Standardization is key to enabling scalability and supporting industry-wide progress. Consider DICOM, for example, which offers a standard for many imaging modalities. The expectation has been set that at least some standardization must be a focus of innovators, so as not to run into an even greater interoperability challenge as the technology grows more complex. Building gold-standard training data sets was highlighted as one solution that begins to address this challenge for AI.
Precision medicine was an end target for many of the applications and topics of discussion at the conference. The development of a clearer, richer phenotype and genotype (or other “’omic” and new forms of data) essentially yields the potential for a digital twin for each individual patient. To accompany this, the development of precision drugs, diagnostic capabilities and other therapies is the end promise of many of the applications of AI, machine learning and other technologies.
We will only be able to deliver on these promises if our data, technology and systems for the delivery of care are able to adapt. To get there, though, we need to establish trust and confidence in AI and its elevated role across the healthcare ecosystem and among patients.
Currently, there are little islands of the right components to make great progress and work is well underway. In oncology, for example, there is data access, a willingness and need from physicians and patients, a motivated industry (opportunity for profit or strategic position), research capacity and funding, payment and regulatory feasibility. At WMIF, Atul Gawande described these environments as “ready environments,” in that they are capable and motivated. The historical example he gave to juxtapose “ready” and “not ready” was the spread of anesthetized surgery over weeks or months, vs. antisepsis, which he estimated to take 20 years to spread.
Not all areas of healthcare are ready environments, but the ubiquity of efforts to utilize AI technologies to accelerate processes, improve accuracy, increase access, increase bandwidth and offer precise care points to a tipping point. Within the next year, I expect to see a thicker volume of applications and, more importantly, more examples of impact.
What’s All the Fuss – Some Thoughts on Recent News
At times it can be challenging to draft commentary on all that is happening across this industry sector. Rather than write short posts for each, I have created an amalgamation of commentary to some of the more newsworthy announcements.
Wow, who knew that data could be such a valuable resource? Roche’s total spend to acquire Flatiron Health, a company focusing on the oncology space, was an eye-popping $2.1B. At first, I just could not fathom why anyone would spend that much for a relatively young company that, despite receiving a lot of VC funding early on, had little to show other than having acquired a modest oncology EHR.
Digging deeper, however, I learned that Flatiron was taking all that oncology data being collected in its EHR at physician offices across the country and cleansing and normalizing it for clinical research purposes. Clean, normalized data is hard to come by in this industry and near impossible in oncology. Upon reflection, it now appears that Roche made an incredibly savvy move, one that will reap a handsome, long-term return on investment for the company.
Veritas’s acquisition of these GE healthcare IT assets is a tough one to understand. Over one billion dollars cash for assets that are dated and fading from the market? Granted, there is that installed base, there is that maintenance revenue to leverage, and if you strip out virtually all SG&A costs you can make some money here, but is it really worth the trouble?
Then again, Veritas’s earlier acquisition of the Thomson Reuters healthcare business, which became Truven and was later sold to IBM for roughly 2x what Veritas paid, shows the firm may know what it is doing. Maybe combining these GE assets with Verscend (formerly Verisk Health), also under Veritas, creates a 1+1=3 scenario, but right now I just don’t see it.
Attended my first American College of Healthcare Executives (ACHE) Congress two weeks ago. This is a very collegial event – warm and welcoming. Everyone is there to learn from one another through various educational sessions and seminars. It is also an event where I was a bit floored, and probably under-dressed, as virtually everyone was in suits and ties.
I attended several sessions, mostly on IT and innovation, to get a feel for how these senior-level executives think about these issues. Came away with a feeling that most really do not see what is coming. Along with all of those suits, one walks away with the impression that there is a certain level of calcification across this audience. Sadly, many will likely become the detritus of the digital train that will run right over them.
Have been taken aback by all the fuss being made about Apple’s recent announcements regarding its Health Record app. From the Twitterverse, to a wide range of trade mags, to blog posts, folks are making this app seem like the second coming – that this signals Apple’s ability to disrupt the industry.
Hold on folks.
While I certainly applaud Apple’s efforts – and for the minority of the population using an Apple iOS device, this may be just what they are looking for – I can’t help but feel a sense of déjà vu.
Were not Google Health and MSFT’s HealthVault going to do the same thing – revolutionize healthcare and put patients’ health records under their control? We all know where that ended – in the dustbin of history.
I’ll stay cautiously optimistic, but will reserve excitement until that day when Android devices also have the same capability with both clinicians and citizens warmly embracing and using this functionality for their care and the care of loved ones.