The Rise of APIs and App Stores in Healthcare
Two years ago, we published a report on the promise of open APIs in healthcare. In APIs for a Healthcare App Economy: Paths to Market Success (available as a free download), provider organizations told us that developing and using APIs was low on their list of priorities. Modern REST-style APIs were still not on the radar for most providers and payers.
Back then, HCOs large and small said they expected their EHR vendor to build an API infrastructure for them. Two years ago, only Allscripts and athenahealth offered an app store along with a comprehensive developer support program. The other EHR vendors were slow-walking FHIR support and had only vague plans for app stores and developer support programs.
Since then, market conditions have continued to change. EHR vendors are now more vocally rolling out the API infrastructures that will bring healthcare into the mainstream of 21st century computing. Every major EHR vendor has delivered a variety of proprietary, HL7, FHIR, and SMART APIs along with the ability to leverage REST to improve their products.
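What these SMART and FHIR APIs enable is, at bottom, authenticated REST access to clinical resources. The sketch below shows the shape of such a call; the base URL and token are hypothetical, and a real SMART on FHIR app would obtain its access token through an OAuth 2.0 authorization flow rather than hard-coding one.

```python
# Minimal sketch of a FHIR read against an EHR vendor's API.
# FHIR_BASE and ACCESS_TOKEN are hypothetical placeholders.
import json
import urllib.request

FHIR_BASE = "https://ehr.example.com/fhir"   # hypothetical EHR endpoint
ACCESS_TOKEN = "example-oauth-token"         # hypothetical; obtained via OAuth 2.0 in practice

def build_patient_request(patient_id: str) -> urllib.request.Request:
    """Build an authenticated FHIR read for a single Patient resource."""
    return urllib.request.Request(
        f"{FHIR_BASE}/Patient/{patient_id}",
        headers={
            "Authorization": f"Bearer {ACCESS_TOKEN}",
            "Accept": "application/fhir+json",  # FHIR's JSON media type
        },
    )

def read_patient(patient_id: str) -> dict:
    """Execute the read and parse the JSON body."""
    with urllib.request.urlopen(build_patient_request(patient_id)) as resp:
        return json.load(resp)
```

The point is less the half-dozen lines of code than what they imply: once every major EHR exposes this style of interface, a third-party app can be written once and pointed at any vendor's endpoint.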
Each of these companies sponsors, or will soon sponsor, an app store for third-party innovation. Interest in using APIs within provider organizations has risen in parallel. A recent Chilmark report, Healthcare App Stores: 2018 Status and Outlook, examines some of these platforms, and the progress made to date, in more depth.
That said, some things have not changed. EHR data remains the most valued data resource in healthcare. All but the largest provider organizations are dependent on their EHR vendor for API enabling technologies. Small and independent developers struggle to participate in app stores and EHR developer support programs, despite great ideas for better apps to improve care delivery. And physician frustration with EHRs continues to grow.
Developing and using APIs is a priority for healthcare stakeholders who want to get more from their EHR investments as they identify opportunities for workflow improvements, real-time analytics dashboards, and more. While EHR vendors are leading the charge, our more recent research suggests that many other stakeholders hold or control access to other key data sources that could underpin such efforts.
The opportunity to capitalize on data already collected to provide advice and predictions is too big to ignore, and the easiest way to do this efficiently will be with integrated apps that can pull data from any relevant resource to provide the necessary insights at the right time.
The number of apps in EHR app stores grows monthly. Yet since the collapse of Happtique’s efforts in 2013, the potential role and benefits of an independent certification body have been absent from the discussion about APIs and app stores. Currently, EHR vendors certify apps for their customers based on internal evaluations, but the rigor and process vary by vendor. An independent and impartial body could do more than inform prospective users: its assessments could deliver tremendous value if they were multi-pronged, covering the ongoing use of an app and offering a consistent way to think about safety, security, and dependability. A certification authority could simplify decision-making, but the real value would lie in giving users more information about how an app delivers value across its install base.
Sometime in the next few months, the ONC will issue new rules on information blocking and what constitutes an API that does not require “special effort” to access and use. While these actions may seem like a watershed moment for health IT, the provider market has moved perceptibly since ONC began its rulemaking. Just two years ago, providers were curious about APIs and app stores but they weren’t ready to make any commitments.
Slowly and inexorably, healthcare is embracing the downloadable app as a tool for innovation and improvement. One-size-fits-all platforms are not meeting the needs of the industry and apps can do more to assist with care provision needs than just provide supplemental functionality – to read more about opportunities for apps to have significant impact, take a look at the infographic we developed to accompany our more recent report, or read more in this deep dive post from June.
Da Vinci Project: FHIR Meets Value-based Payments
While healthcare frets and obsesses about the state of exchange between providers, payers have been relatively slow to embrace modern ideas about data movement. On the federal level, Blue Button 2.0 lets Medicare beneficiaries download or authorize the download of their own health data. This federal program has generated a lot of interest, signing up roughly 500 organizations and 700 developers at last count. Claims data on 53 million beneficiaries spanning 4.5 years is available for authorized developers and applications. Commercial payers have yet to emulate such an approach. However, there are indications that they are beginning to take notice and act.
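Blue Button 2.0 exposes Medicare claims as FHIR ExplanationOfBenefit resources behind a beneficiary-authorized OAuth 2.0 flow. A hedged sketch of the claims query an authorized app would issue, assuming a sandbox-style base URL:

```python
# Sketch of a Blue Button 2.0-style claims search. The base URL is
# modeled on the CMS sandbox; a production app would first complete
# an OAuth 2.0 authorization with the beneficiary.
from urllib.parse import urlencode

BB_BASE = "https://sandbox.bluebutton.cms.gov/v1/fhir"  # sandbox-style base URL

def claims_query(patient_id: str, count: int = 10) -> str:
    """Build the search URL for a beneficiary's ExplanationOfBenefit resources."""
    params = urlencode({"patient": patient_id, "_count": count})
    return f"{BB_BASE}/ExplanationOfBenefit/?{params}"
```

A commercial payer offering the equivalent endpoint would let members delegate the same kind of read-only claims access to apps of their choosing.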
The Da Vinci Project is a new private sector initiative to address the technical requirements for FHIR-based data exchange among participants in value-based care (VBC) programs. Its general goal is to help payers and providers with techniques and ideas for information exchange that contribute to improved clinical, quality, and cost outcomes. More specifically, it wants to facilitate the creation of use case-specific reference implementations of FHIR-oriented solutions to information exchange challenges in value-based care.
The Da Vinci Project is a relatively new project from Health Level Seven (HL7). Its founders represent organizations with experience across a range of VBC business challenges and with the FHIR standard. Currently, it has the support of 27 organizations, including 11 payers, 10 health IT vendors (3 of them EHR vendors), and 6 providers.
Da Vinci aims to deliver implementation guides (IG) and reference software implementations for data exchange and workflows needed to support providers and payers entering and managing value-based contracts and relationships. FHIR implementation guides are roughly analogous to an IHE profile in that they define actors and actions in a defined use case. In this instance, the idea is to help providers and payers operationalize their complementary processes in value-based contracts.
Da Vinci at one time contemplated creating nine use cases. It eventually narrowed this focus to developing IGs for two that have particular VBC relevance:
Medication reconciliation is a time-consuming part of patient encounters anywhere on the care continuum. Post-discharge, this capability can reduce the incidence of adverse drug events and head off re-hospitalizations. The objective of this project is to create a simple workflow that allows providers to attest that a medication reconciliation is complete.
Coverage requirements discovery enables a provider to request payer coverage requirements from within the clinical workflow. In-workflow coverage details can make point-of-care decisions about orders, treatments, procedures, or referrals more efficient. The ability to make both clinical and administrative decisions mid-encounter, with the patient present, minimizes the non-clinical cognitive burden on providers. By reducing denials and delays, it also frees administrators for other patient-specific work. As prior authorization becomes more common in value-based payment programs, providers could save time if such information were more readily available to them.
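The coverage requirements discovery use case builds on the CDS Hooks pattern: as a clinician selects an order, the EHR posts context to a payer's decision-support service and receives "cards" with coverage guidance. The sketch below is illustrative only; the hook name follows the CDS Hooks specification, but the hookInstance UUID and the shape of the draft order are hypothetical placeholders.

```python
# Hedged sketch of a CDS Hooks invocation of the kind Da Vinci's
# coverage requirements discovery work contemplates. All identifiers
# are examples, not values from any real implementation guide.
import json

def build_crd_request(patient_id: str, draft_order: dict) -> str:
    """Assemble an order-select hook invocation as a JSON request body."""
    payload = {
        "hook": "order-select",  # fired as the clinician picks an order
        "hookInstance": "d1577c69-dfbe-44ad-ba6d-3e05e953b2ea",  # example UUID
        "context": {
            "patientId": patient_id,
            "draftOrders": {  # FHIR Bundle wrapping the in-progress order(s)
                "resourceType": "Bundle",
                "entry": [{"resource": draft_order}],
            },
        },
    }
    return json.dumps(payload)
```

The payer's response, in this model, arrives while the order is still on screen, which is precisely what makes mid-encounter clinical and administrative decisions possible.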
Both IGs will be balloted in HL7’s September Ballot Cycle, and Da Vinci is actively encouraging people to comment. Da Vinci team members will be on hand to help implementers interested in these two initial use cases at HL7’s September Connectathon in Baltimore, MD. It has also begun working on requirements for a third use case, Documentation Templates and Payer Rules.
While the data scope of this effort is not nearly as broad as Blue Button’s, which provides access to a patient’s claims history, it is a reasonable first step. Making coverage information REST-accessible and granular could underpin applications that vastly improve on the flurry of phone calls, faxes, and EDI-based document exchange that bedevils hospitals and physician offices. Hopefully, this effort is a leading indicator of better access to commercial payer data for patients and providers.
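"REST-accessible and granular" could be as simple as a FHIR search for a member's active Coverage resources. The payer endpoint below is hypothetical; the resource type and search parameters are standard FHIR.

```python
# Sketch of a granular coverage lookup against a payer's FHIR endpoint.
# PAYER_BASE is a hypothetical placeholder.
from urllib.parse import urlencode

PAYER_BASE = "https://payer.example.com/fhir"  # hypothetical payer endpoint

def coverage_search_url(member_id: str) -> str:
    """Build a search for a member's active Coverage resources."""
    params = urlencode({"beneficiary": f"Patient/{member_id}", "status": "active"})
    return f"{PAYER_BASE}/Coverage?{params}"
```

A single call of this shape, answered in milliseconds, replaces a phone call or fax that today can take hours.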
Promoting Interoperability: MU Fades to Black
Seeking to liberate the industry from its self-created morass of siloed data and duplicative quality reporting programs, the Department of Health and Human Services (HHS) issued 1,883 pages of proposed changes to Medicare and Medicaid. It renamed the Medicare and Medicaid Electronic Health Record (EHR) Incentive Programs (known by all as Meaningful Use) to Promoting Interoperability Programs (PI).
As widely reported, it would eliminate some measures that acute care hospitals must report and remove redundant measures across the five hospital quality and value-based purchasing programs. It would also reduce the reporting period to 90 days. HHS will be taking comments until June 25, 2018.
HHS believes that requiring hospitals to use 2015 Edition CEHRT in 2019 makes sense because such a large proportion of hospitals are “ready to use” the 2015 Edition. But ready to use is not the same as using, and 2015 Edition EHRs may not be as widely deployed as HHS indicates. The following 10-month-old snapshot from ONC shows that hospitals have not aggressively moved to adopt 2015 Edition CEHRT.
Current adoption levels by HCOs are undoubtedly better, and many vendors have 2015 Edition technology ready to go, but hospitals can only change so fast. The rush to get hospitals onto the most current edition stems from the most relevant difference between the 2014 and 2015 Editions – the API requirement. APIs will be the technical centerpiece of better, more modern interoperability, but adoption levels are still low. APIs by themselves offer only the promise of better data liquidity; for that promise to become a reality, healthcare stakeholders need more than a solid set of APIs.
HHS is also proposing that hospitals post standard charges and update that list annually.
This is a nice thought, but it will take some heavy lifting to pull this off. For starters, HHS doesn’t even have a definition of “standard charge” and is seeking stakeholder input before the final rule is published. HHS also must determine how to display standard charges to patients, how much detail about out-of-pocket costs to include (for patients covered by public and private insurance), and what noncompliance penalties are appropriate.
Above all, there’s the thorny issue of establishing what a standard charge is in the first place. Charges vary by payer. Can a hospital truly state, without a doubt, the cost of an MRI or a colonoscopy? Most cannot – and technology alone will hardly solve this problem.
The existence of APIs will stand in the stead of the old view/download/transmit (VDT) requirement. Regarded as one of meaningful use’s most troublesome and fruitless requirements, this rule has been shed by HHS because of “ongoing concern with measures which require patient action for successful attestation.”
VDT is one of several MU Stage 3 requirements pertaining to patient engagement – along with providing secure messaging or patient-specific educational resources – that HHS has proposed dropping, under the pretense that it is “burdensome” to healthcare providers. While hospitals have struggled to get many patients to participate, the VDT requirement set the bar at one patient out of an entire population. What’s more, dropping the requirements fails to take into account how burdensome it is for patients to try to access their data, communicate with their physicians, and learn about their conditions and treatment options. It is also contrary to CMS Administrator Seema Verma’s remarks, first at HIMSS18 and again this week, indicating that the agency seeks to “put patients first.”
HHS says that third-party developed apps that use APIs will deliver “more flexibility and smoother workflow from various systems than what is often found in many current patient portals.” Whether such apps deliver “smoother workflow” is not a foregone conclusion.
HHS proposes “a new scoring methodology that reduces burden and provides greater flexibility to hospitals while focusing on increased interoperability and patient access.” The proposed scoring methodology uses a 100-point system (explained over 24 pages) in which attaining a score of at least 50 means there will be no Medicare (or Medicaid) payment reduction.
HHS is also mulling whether to abandon these measures altogether in favor of scores calculated at the objective level.
The biggest regulatory effort in recent months related to interoperability, other than this proposal, has been ONC’s proposed Trusted Exchange Framework and Common Agreement (TEFCA), required under the 21st Century Cures Act. TEFCA, well along in the planning stages, is a new set of regulations from ONC whose goal is to catalyze better data availability using APIs. HHS in this regulation wants public comment on whether participation in a TEFCA-compliant network should replace the process measures in the Health Information Exchange objective. Stated another way: Should TEFCA compliance replace 80 percent of the score for PI (75 percent in 2020)?
TEFCA is widely expected to provide a safe harbor from data blocking liability although ONC has been mum on this point. TEFCA then could do double duty: Eliminate the need to meet or report on health information exchange metrics and provide a shield from data blocking enforcement.
But there are, as yet, unanswered questions about TEFCA.
HHS is also considering doing away with the Public Health and Clinical Data Exchange objective. It floated the idea that a provider supporting FHIR APIs for population-level data would not need to report on any of the measures under this objective. Combined with the TEFCA knockout, this would replace 90 percent of the score for PI (85 percent in 2020).
The specific API mentioned, called Flat FHIR and still in development, will probably help with part of the complex process of public health and registry reporting. This activity currently requires highly skilled data hunter-gatherers, usually with clinical credentials, who in many organizations manually sift and collate multiple data sources to meet the varied requirements of different registries. Flat FHIR, once production-ready, will certainly help, but it is unlikely to provide all, or even most, of the information needed for the range of public health reporting programs.
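Flat FHIR centers on an asynchronous $export operation: the client kicks off a population-level export and then polls a status URL until the server has prepared bulk files. A minimal sketch of the kick-off request, assuming a hypothetical server and the draft-era shape of the specification:

```python
# Sketch of a FHIR bulk data ("Flat FHIR") export kick-off.
# FHIR_BASE is a hypothetical placeholder; the spec was still in
# draft at the time of writing, so details may change.
import urllib.request

FHIR_BASE = "https://fhir.example.org"  # hypothetical population-level endpoint

def build_export_kickoff(group_id: str) -> urllib.request.Request:
    """Start an asynchronous export of all resources for a patient group."""
    return urllib.request.Request(
        f"{FHIR_BASE}/Group/{group_id}/$export",
        headers={
            "Accept": "application/fhir+json",
            "Prefer": "respond-async",  # server replies 202 with a status URL to poll
        },
    )
```

Even in this best case, the export delivers raw resources; the sifting and collating that registry submissions require would still happen downstream.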
HHS acknowledges that providers are less than thrilled with aspects of the Quality Payment Program (QPP). HHS wants to know how PI for hospitals can better “align” with the requirements for eligible clinicians under MIPS and Advanced APMs. In particular, it wants ideas about how to reduce the reporting burden for hospital-based MIPS-eligible clinicians. It is undoubtedly looking for market-acceptable ideas to reduce the reporting burden where it is arguably more deeply felt – among non-hospital-based MIPS-eligible clinicians. While reducing or eliminating the reporting burden would help such providers, the big unanswered question, as it is with hospitals, is the burden of getting to 2015 Edition CEHRT.
HHS also asks the industry how it could use existing CMS health and safety regulations and standards to further advance electronic exchange of information. It is ready to change Conditions of Participation (CoPs), Conditions for Coverage (CfCs), and Requirements for Participation (RfPs) for Long Term Care Facilities regulations to this effect. It wants to know whether requiring electronic exchange of medically necessary information in these regulations would move the interoperability needle.
HHS believes that APIs will solve all of the problems that patients and healthcare stakeholders have with data access. HHS also seems prepared to declare that TEFCA compliance and 2015 Edition CEHRT guarantees that those APIs are in place. It roundly ignores the mesh of incentives that make stakeholders unwilling to share data and patients unable to access data. The industry has cried out for less process reporting and better insight into outcomes for years. This will accomplish the former but set the industry back with respect to the latter if interoperability is declared solved based on technology alone.
Cerner Impressions: Beginning Signs of PHM Maturation
Two weeks ago, we attended the 2017 Cerner Collaboration Forum, which combined the Population Health Summit with three other annual events (Ambulatory Summit, CareAware Summit, and Cerner Physician Community). This larger single summit drew a broader representation of provider organizations and other attendees. The power of combining these events also reflects the growing convergence of technologies, workflows, and practices needed as providers, payers, and other stakeholders further shift to value-based care.
Here are four key impressions of the event.
PHM presentations have matured. The client population health management (PHM) presentations employed a case study approach and included results from deploying various HealtheIntent solutions. This was a change from the past two years, which focused more on implementation issues and presented limited utilization and outcomes data. Early adopters of HealtheIntent now have two or three years of experience using some of the solutions (HealtheRegistries, HealtheCare) and have started to really focus on operational and tactical-related issues, ranging from integration with transition management processes to population health maintenance to care team workflows.
Starting a provider-owned health plan is not for the meek. Bill Copeland from Deloitte Consulting spoke about research on 224 provider-owned plans with more than $1 billion in annual revenues. Deloitte has identified four success factors for these types of plans.
• Core line of business. It is a significant percentage of their overall payer mix and the HCO has lengthy experience in treating those patients prior to enrolling them in a provider-owned health plan.
• Tenure. The 10 most profitable plans were all more than years old.
• Scale. Plans need more than 100,000 members in order to have an adequate risk pool.
• Market Share. In their market, they rank at least second or third in market share, with the ability to buy services out of network at reasonable costs.
What really caught my attention was that it took a minimum of five years for a new provider-owned health plan to become profitable, with significant losses realized in the first two or three years. This should give health systems with constrained cash flows significant pause.
Provider demand for APIs is strong, but they have questions. A half-day event focused on FHIR and APIs preceded the summit. Provider interest is strong – Cerner has more than 1,100 applications in its sandbox, and this number is increasing rapidly – but a few key questions remain.
• Security or convenience? Which of these two attributes will patients demand more of, and what tradeoffs are they willing to make in the process?
• Vendor role? Providers – and potentially patients – need more guidance and intelligence on the information brought in via APIs, especially regarding the variety and veracity of the data. Using this data to provide analytic insights is still a bit down the road.
• Increasing utility? Providers ultimately want API-supplied information placed into their flow charts/templates within the EMR, and wherever else appropriate, with little manual intervention.
• Who regulates? There’s a large degree of ambiguity at the federal level as to which agency or agencies (FDA, FTC, HHS, etc.) will regulate privacy, security and competition-related issues. This has caused some hesitation among providers and vendors.
Health IT is an enabler, but more is needed for community-level change. A panel on Healthy Nevada, a five-year partnership between Cerner and the city of Nevada, Mo., discussed the lessons learned and impact of the partnership.
Since 2012, Vernon County has moved from 88 to 60 in Missouri’s county health rankings (as updated annually by the Robert Wood Johnson Foundation). HealtheIntent gives the community a single source of community-wide data and access to real-time public health information, which is used to segment the population and begin to employ ‘hot-spotting’ interventions.
What struck me during the panel, and in a few conversations I had afterward, was that despite the desire to promote healthy lifestyles, a public health or health system-led program simply will not work without broader engagement with various civic organizations, especially in rural areas or among certain population segments. Cerner and a number of other vendors are starting to place greater emphasis on consumer and caregiver engagement functionality – but as our recent Care Management Market Trends Report suggests, there is much progress to be made.
Will small providers be left behind? This year’s Cerner Collaboration Forum marked the beginning of a maturation of provider attendees who have been on the HealtheIntent platform (HealtheRecord, HealtheRegistries, HealtheCare) for a few years now. Early adopters are moving away from implementation-related issues and starting to fine-tune how these solutions support specific care management and care coordination use cases and workflows.
Combining the events also broadened the audience. We received feedback from a wider array of providers, including smaller customers that have not yet made a substantial PHM investment and others reevaluating earlier investments in point solutions while looking for a more comprehensive “go forward” data strategy and IT investment.
The question is whether most small to midsize providers are willing and able to make the necessary PHM investments, including in health IT, going forward. The Cerner event did not really cover this topic, and there are mixed signals in the marketplace about this segment, including when PHM buying activity may begin to pick up in earnest.
Can APIs Really Improve HIT?
We hosted a webinar on this topic on December 14, 2016. To retrieve a copy of the slides, please follow this link. We will also be posting the recording.
A critical element of business success across industries has been the surge in use of open application programming interfaces (APIs) that provide data for applications that did not create or originate the data. APIs are the technical foundation of engaging interfaces and high-value interactions between different applications. Application ecosystems such as Google Play and the Apple App Store would not exist without open APIs that enable data access across multiple sources and organizations. APIs have fueled the growth of application vendors like Salesforce, Workday, and countless others. Open APIs in healthcare promise a HIPAA-compliant way to enhance a digital portfolio with an ecosystem of third-party applications and services.
To understand what it will take to build an API program in healthcare, we conducted a broad survey of the healthcare market to solicit ideas and opinions about the opportunities and challenges represented by APIs. On December 14, 2016, we will conduct a webinar that will discuss the findings of this research and implications for the health IT industry as a whole.
These interviews indicate pervasive dissatisfaction with existing systems of record and systems of engagement. But they also show significant enthusiasm for wider availability of APIs that can make it easier to develop and improve these systems.
The API Opportunity in Healthcare
The API Challenges
This webinar will provide an overview of our distillation and analysis of the topics raised by various industry stakeholders as they look beyond custom-built one-to-one interfaces to an infrastructure that relies on repeatable, portable and reusable ways to get data and invoke functionality to optimize care across the community they serve.
Salesforce Enters the Fray: Will They Succeed Where Others Have Failed?
After several years of circling the healthcare market, Salesforce formally announced its entry this week with Salesforce Health Cloud. Other enterprise vendors have jumped into this market with blue-ribbon advisory panels (Google Health), a series of acquisitions (IBM Watson Health, Intuit), or a mixture of both (Microsoft); Salesforce’s announcement had little in the way of any of these attributes to bolster it.
Salesforce is taking a much more tentative and low risk approach to entering the healthcare market and will look to its expansive ecosystem of partners who will leverage Salesforce’s existing tools to create healthcare specific solutions and services.
JASON Task Force – Part Deux
In a post two weeks ago, we were critical of some aspects of the JASON Task Force’s (JTF) Final Report on healthcare interoperability. Two members of the JTF reached out to us in order to clarify the intent of the report as it relates to EHRs and the use of a “public” API to help make healthcare applications more interoperable. During a long conversation, we had a chance to discuss the issues in detail. Following that discussion we took some time to reconsider our opinions.
We now have to agree that the JTF itself was not EHR vendor-dominated and have corrected the previous post. The Task Force comprised a wide range of stakeholders, including several providers. Unfortunately, however, the testimony the JTF received came overwhelmingly from HIT vendors, consultants, or their proxies. We doubt this was intentional – the vendor community simply had a more vested interest in influencing the JTF. But it leads us to conclude that the JTF should proactively solicit provider testimony before policymakers act on the report’s recommendations.
Despite our long conversation with these members of the JTF, we still have a difference of opinion on one key issue: The central importance of the EHR with regards to public APIs and interoperability.
The original JASON Report points squarely at EHRs as the source of interoperability ills. It also called for EHRs to adopt the public API. By our count, the JTF Final Report uses three different ways to describe where the public API should sit: a “Data Sharing Network”, “CEHRT”, or “clinical and financial systems.” In our follow-up discussion, JTF representatives maintained that the intent to include EHRs is clear and that the task force struggled on this issue of how broad their mandate was.
The JTF decided to cast a broader net than just the JASON Report’s initial focus on EHRs. But they did not clarify an already complicated issue, nor did they unequivocally single out EHRs as where the need for a public API should begin. We think that their intention to include EHRs is sincere but maintain the position that the JTF should explicitly recommend that EHRs expose services and data with the public API. Without such clarity, the fuzzy language used in the JTF report could end up being adopted in future rule-making or legislation, creating the potential for uncertain outcomes.
Good, bad, or otherwise, the EHR is the dominant application supporting clinical workflows and the source of most patient healthcare data.
Every provider we have ever talked to says that improved patient care and more effective care coordination would be possible with better access to other providers’ EHRs. On the other hand, we have not talked to many providers who say that better patient care and better care coordination would be possible if only there was better access to other providers’ financial systems. The majority of providers have never heard of a Data Sharing Network (and no, we do not believe Direct can fill this bill) so the public API is pretty much dead in the water there as well – though most any HIE/CNM vendor worth their salt would welcome a public API.
So let’s be perfectly clear in the JTF report – if we want EHRs to adopt a public API, then let’s just say so rather than beating around the bush. To do otherwise sends the wrong message to the market – that EHRs are somehow not central to the interoperability problem.
JASON Task Force Says Status Quo is Better than Supersession
The JASON Report created quite a fuss in the HIT marketplace as some screamed foul and others were encouraged that maybe, just maybe the JASON report may force movement to more open systems. To clear the air, the JASON Task Force (JTF) was formed to solicit industry feedback for policy makers. The JTF released their findings earlier this month.
The JASON Report was an AHRQ- and HHS-sponsored study of healthcare interoperability issues. Its basic conclusion was that the existing EHR-based HIT infrastructure should be superseded by something more open and amenable to use by other applications and across organizations. The JASON Report advocated radical solutions to the interoperability crisis: using MU3 to replace existing EHRs and requiring a uniform set of APIs for EHRs across the industry.
Vendor response was rapid and unified. HITPC appointed a task force representing stakeholders from across the industry (virtually all have served on other ONC workgroups, making it a somewhat cloistered group) that worked with alacrity through the summer. The tone of vendor testimony before the JTF reflected a level of alarm that contrasts sharply with HCOs’ non-participation.
The JTF and its vendor members have some legitimate beefs: the JASON Report is not exactly disinterested. It substantially reflects the view of the clinical research community, which sees itself as the long-suffering victim of EHR intransigence. The JASON Report also glosses over genuine, if crepuscular, progress in healthcare interoperability. Another point that we believe has not been made forcefully enough by EHR vendors is that they are constrained by their HCO customers’ ability to change. The organizational obstacles to healthcare data liquidity are significant, and EHR vendors move only as fast as HCOs, despite claims to the contrary. However, we think the JTF is wrong to deflect attention away from EHR-oriented APIs.
The JTF’s proposed alternative to EHR supersession involves something it calls Data Sharing Networks (DSNs). These are a rebranding of the HIE, supplemented with a uniform set of APIs to support access to something never specified in much detail. The JTF suggests that these APIs be based on FHIR, HL7’s emerging successor to its earlier exchange standards.
Without doubt, FHIR represents a significant improvement over earlier HL7 standards along multiple dimensions. But the idea that FHIR alone can cure healthcare’s interoperability ills is all smoke. Behind this smokescreen, EHR vendors are hoping that people eventually lose interest or stop talking about interoperability. With this bit of redirection, the JTF has basically let the EHR vendors off the hook.
This raises the question: Where is the best place for a uniform set of APIs to reside – the DSN (HIE) or the EHR?
Our answer: Both!
The HIE is really a stopgap measure in the sense that discrete access to EHRs and other data sources across organizations, via a uniform set of APIs and SOA, will greatly reduce the need for an HIE. If applications could access all of a patient’s data directly from native data sources in different HCOs, there would be little point in maintaining separate, comprehensive CDRs at different sites across the healthcare system.
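The argument can be made concrete with a toy sketch: given one fetch function per organization's API, an application assembles the patient's record itself, with no intermediary CDR. Everything here is illustrative; real systems would also need record matching, deduplication, and error handling.

```python
# Toy illustration of cross-organization record assembly over uniform
# APIs: each fetcher stands in for one HCO's endpoint. No central CDR
# is involved -- the application composes the record on demand.
def assemble_record(patient_id, fetchers):
    """Merge the resource lists returned by per-organization fetch functions."""
    record = []
    for fetch in fetchers:  # one fetcher per HCO's API endpoint
        record.extend(fetch(patient_id))
    return record
```

The design point is that uniformity lives in the API contract, not in a shared repository: add a new organization and the application gains its data by adding one more fetcher.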
But rather than move in this direction, the JTF favors the politically powerful EHR vendors at the expense of the HIE vendor community.
No doubt creating a set of uniform APIs to EHRs would be costly. Upward and backward compatibility, a hallmark of every successful IT platform, requires deeper pockets than most EHR vendors can muster. But some EHR vendors are better positioned to support such APIs than others. Many hospital EHR vendors could make the investment. Smaller, community-focused, or client-server based EHR vendors and their customers though would struggle.
Our HIE research has shown – year after year – that data flows downhill from hospital to community. Hospital-based EHR data is valuable to community-based clinicians. It is also extremely valuable to those hospitals to ensure that physicians in a community get discharge summaries to minimize readmissions and associated penalties. Hospital-based EHRs are a good place to start with uniform APIs. The reality is that community-based EHR data could also be better used in hospital settings to facilitate care. This is especially true as we move away from fee for service to more risk-based payments models.
Unfortunately, facilitating data flows between community and hospitals is something we’ll be patching together with string, baling wire, and duct tape for the duration. The JASON Report and subsequent JTF report have not moved the ball forward on this issue. In our opinion, there is little that policy folks in Washington, D.C. can do with additional prescriptive meaningful use requirements. HHS would better serve the market with financial incentives that push healthcare organizations to demand better interoperability capabilities from their vendors – it is the customer that vendors really listen to, not D.C. policy wonks.