My post before HIMSS talked about how jazzed (not jaded) I was to be attending my 20th HIMSS conference. Now that HIMSS18 is in the bag – what did I learn?
ML and AI
On Monday I presented the results of our AI survey at the Machine Learning & AI for Healthcare preconference event at the Wynn. Although there were a few hundred seats, the event sold out weeks in advance. A number of folks I knew who planned to buy a ticket at the door were shut out. So that’s a lesson – when it comes to attending sessions on hot topics, don’t procrastinate!
Keynote speaker Lynda Chin from the University of Texas compared using AI to having a paralegal on your team – someone intelligent who can pull resources together to help you make better decisions. She summed it up simply: “Machines serving humans, not humans serving machines.”
Many other speakers came from large health systems and spoke to important use cases:
- Kaiser Permanente: Colorectal cancer detection 1-2 years earlier with AI.
- UPMC: Game-changing pediatric readmissions prediction.
- Duke Institute for Health Innovation: Implementation of AI with thousands of input features.
- Stanford Health Care: Working with “small” vs “big” data.
- Brigham & Women’s: Costs of short-time cancellations vs no-shows.
It’s become a given that these leaders and their vendors use AI and use it well. My favorite from the above examples was Srinivasan Suresh, CMIO at Children’s Hospital of Pittsburgh of UPMC. His slide highlighted that, although he had no impressive AI or ML credentials, he was still able to use these kinds of tools successfully to predict pediatric readmissions due to seizures, asthma, and pneumonia, which led to more effective interventions.
AI and the cloud were key themes this year and have become mainstream topics. For our views on Eric Schmidt’s keynote about data, analytics, and AI, see our earlier HIMSS18 recap blog.
Glad I had teammates who made it into Seema Verma’s CMS keynote the next day – her announcement about patient data access, open APIs, and Blue Button 2.0 was welcome. You may recall that the previous year, given the change of administration, there was little that CMS or ONC could say about anything. Although we’re seeing some progress, it doesn’t seem substantial enough to move the needle on value-based care.
Natural Language Processing (NLP)
A big part of my week was meeting with NLP vendors. Chilmark Research is close to releasing our major report on this topic, and it was great to get insights from more than a dozen vendors. Some of the smaller ones are highly focused on specific use cases (Health Fidelity and Talix on risk stratification; Clinithink on matching patients to clinical trials). 3M and its partnership with Alphabet’s Verily are a powerful combination for determining the “dominoes” of costs and care. Also of note: M*Modal’s virtual provider assistant and use of ambient devices, as well as Nuance’s partnership with Epic to add more conversational AI functionality. We are seeing voice assistant success paving the way to virtual scribes – those that can “whisper” in the physician’s ear will be most valuable to ensure that decision support is not bypassed by passive systems.
As John Moore posted in his earlier HIMSS18 recap, it’s sad (well, infuriating) that we still have to address interoperability. I attended two events held by the Strategic Health Information Exchange Collaborative (SHIEC), which has been successful in providing a rallying point for 60 HIEs and 40 vendors to share knowledge and provide comments to ONC regarding TEFCA and data exchange. But it only represents a fraction of the hundreds of private and public HIEs in the country, so there is still a long road ahead. The presence of a payer committee was a welcome sign that convergence is part of SHIEC’s agenda.
At the opposite end of the interoperability spectrum, I attended a session by Houston Methodist on body sensors, where the distances are measured in inches and the signals are often so weak that temperature or motion (such as a kicking baby) are enough to throw them off. Sensor network fusion is the frontier – the more information you can capture from more places with more context, the better. For example, one of Methodist’s use cases was rapidly predicting a patient fall.
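To make the fusion idea concrete, here is a minimal sketch (my own illustration, not Houston Methodist’s actual system) of how several weak, noisy sensor signals might be combined into a single fall-risk score by weighting each reading by how much we trust that sensor:

```python
# Illustrative sketch only -- not Houston Methodist's actual system.
# Fuses several weak bedside sensor signals into one fall-risk score,
# weighting each reading by our confidence in that sensor.

def fuse_fall_risk(readings):
    """readings: list of (risk_estimate 0-1, confidence 0-1) tuples, one per sensor."""
    total_weight = sum(conf for _, conf in readings)
    if total_weight == 0:
        return 0.0  # no usable signal from any sensor
    return sum(risk * conf for risk, conf in readings) / total_weight

# Hypothetical readings: bed pressure mat, room motion sensor, wearable gyroscope.
readings = [
    (0.9, 0.8),  # pressure mat: patient at bed edge, fairly reliable
    (0.6, 0.3),  # motion sensor: ambiguous movement, low confidence
    (0.7, 0.9),  # wearable: unsteady gait detected, high confidence
]
score = fuse_fall_risk(readings)
print(f"fall risk: {score:.2f}")  # alert staff if the score exceeds a threshold
```

A real system would also have to handle the interference the speakers described – temperature shifts or a kicking baby can swamp these faint signals, which is exactly why combining sources with confidence weights matters.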
I also met with Somatix, a small vendor with a big idea we’ve been hearing about for years – using data from wearables to track more routine activities of daily living (some of which, like smoking, are harmful). The vendor is attempting to take this to the next level with more accurate gesture detection and predictive analytics so appropriate (and even real-time) interventions can be made using specific apps. As Brian Eastwood recently posted, we’re still waiting for wearables to provide insight. I didn’t sport a wearable at HIMSS18 (I broke two and lost another in 2017), but I’m on the lookout for a good, waterproof one.
Another key area of focus for us is the use of AI to interpret digital medical images. An impressive talk by the University of Virginia and the National Institutes of Health included the use of speech recognition (using Carestream and Epic) to embed hyperlinks to AI-recognized areas of interest into reports for the EHR. The two-year effort showed productivity improvements of 3x over unassisted analysis and reporting.
A presentation by Enlitic claimed AI-enabled “superhuman” techniques able to detect lung cancer two years sooner than existing approaches. Their solution made it easy to compare an existing case to similar cases, with timelines of data showing disease progression. The company has 65 radiologists who label its training data, and claims that only 1 in 4 applicants for the job passes their test. We’ll dive into detail about these kinds of advances in our Digital Medical Imaging Report scheduled for Q4’18.
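The case-comparison feature is, at heart, a similarity search over case features. As a generic illustration (not the vendor’s actual method, and with made-up feature names), nearest-neighbor retrieval over simple imaging features might look like this:

```python
# Generic nearest-neighbor retrieval sketch -- not the vendor's actual method.
# Given a feature vector for a new case, find the most similar past cases
# so a radiologist can review how those patients' disease progressed.
import math

def euclidean(a, b):
    """Straight-line distance between two equal-length feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def most_similar(query, case_library, k=2):
    """case_library: dict of case_id -> feature vector.
    Returns the k case ids closest to the query vector."""
    ranked = sorted(case_library, key=lambda cid: euclidean(query, case_library[cid]))
    return ranked[:k]

# Hypothetical feature vectors: (nodule diameter mm, density score, growth %)
library = {
    "case_A": (4.0, 0.2, 1.0),
    "case_B": (12.0, 0.8, 9.0),
    "case_C": (11.0, 0.7, 8.5),
}
print(most_similar((11.5, 0.75, 8.8), library))
```

In production such systems typically use learned image embeddings rather than hand-picked features, but the retrieval step follows the same shape.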
I spent time with Ambra, a major provider of image exchange solutions (others include Nuance and lifeIMAGE). Aside from the challenge of the sheer size of medical images, it always surprises me how difficult it is to move them around and make them available despite good standards (DICOM). It was only recently that Epic, for example, addressed image exchange, and it’s not part of many HIEs. I’m glad to see we’re moving beyond the vendor neutral archive (VNA) discussion and toward a focus on the cloud and useful exchange of images in clinician workflow.
I also attended half a dozen receptions during the week. The biggest was sponsored by a large consulting firm. It was an evening of fun, but it reminded me of what was right and wrong about our industry and a conference in Las Vegas – who’s really paying that bill? My last reception was with BetterDoctor, which specializes in the quality of provider directory data. It always seemed ironic to me that the most regulated profession in the world has such a problem with accurate information (retirement, credentialing, locations, and so on).
My Ears Hurt
To rework my “I’m Jazzed” comment from the top with a music metaphor, HIMSS is more like a blaring of thousands of different instruments, with each of the “sections” competing to be louder than the others – and the sounds of Vegas don’t help. There are many great musicians and an increasing number of duets (e.g., partnerships, ACOs), but we’re still playing off too many different pages. Adding to the problem is the conductor (the government) changing every few years.
It may be more of a cacophony than a symphony, but I’m glad to be in the orchestra. I hope you are, too.