HIPAA, Trust, and the Impact of Data Poverty

Sep 27, 2021

Examining recent trends in SDoH, Privacy, and AI

Key Takeaways

Proposed changes to HIPAA (ONC PDF) that would make it easier to share data with social services organizations are raising privacy concerns. As initiatives such as the Gravity Project continue to build standards for SDoH, the concerns of privacy advocates will likely accompany the push for data standards, reinforcing the need to ensure the privacy and security of patient data.

Data poverty is another growing concern that may affect data analytics for SDoH and beyond. Historical biases in clinical research and AI model training may limit how well some tools apply to under-represented populations, thereby increasing disparities.

Building trust among under-represented populations is critical to addressing the above data challenges. It requires increasingly data-driven institutions, which have historically relied on AI models with known bias issues, to work with community leaders to establish trust. It is never too early to build governance mechanisms that mitigate the harm of privacy violations and data poverty.

Introduction

Over the summer we covered some of the recent movement toward SDoH standards issued by the ONC. Lost in some of the coverage of standards, however, has been the growing issue of privacy around SDoH data. Accompanying the growing interest in using SDoH data to improve quality of care and outcomes is the need to improve data sharing beyond the clinic. A proposal to change federal privacy rules would make it easier to exchange data with social services and community-based organizations (see also the push to standardize the definition of EHI). The intent is to offer additional services to patients where SDoH has an impact on individual outcomes. While the goal is noble, it is beginning to raise a number of privacy concerns in some quarters.

Currently, HIPAA allows organizations to share data with social services organizations involved in care coordination for patients under the care of healthcare organizations. The Office for Civil Rights has permitted sharing of protected health information (PHI) with social services for years; however, it has found that HCOs are often hesitant to do so for fear of HIPAA violations. The recent move to loosen data sharing rules is a response to those fears.

Privacy Challenges of SDoH

The most important issue boils down to the fact that social services organizations are not bound by HIPAA. A great deal of SDoH data can be linked to behavioral and mental health and, if accessed by a third party, could expose patients to stigmatization and potential employment challenges. Information blocking rules give social services organizations a mechanism to file complaints if they cannot access PHI required to provide care. HCOs can also use business associate agreements to oblige social services organizations to abide by HIPAA when receiving PHI from the healthcare provider.

As this issue rises in visibility, we have been monitoring how SDoH vendors are addressing privacy concerns. In 2020, Signify Health and Independence Blue Cross launched CommunityLink, a platform hosting a network of community-based organizations in the Philadelphia region. Signify Health and its Signify Community unit created a privacy framework for SDoH data and collaborations built on a rules-based engine in which members can give consent on what information can be shared, when, with whom, and for what purposes.
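As a rough illustration of what such a rules-based consent engine might look like, here is a minimal sketch in Python. The field names (category, recipient, purpose) and the default-deny evaluation logic are our own assumptions for illustration, not Signify's actual design.

```python
from dataclasses import dataclass
from datetime import date
from typing import List, Optional

@dataclass
class ConsentRule:
    category: str                   # e.g. "food_insecurity", "housing", "behavioral_health"
    recipient: str                  # e.g. "community_food_bank"
    purpose: str                    # e.g. "care_coordination"
    expires: Optional[date] = None  # consent can be time-limited

def is_sharing_permitted(rules: List[ConsentRule], category: str,
                         recipient: str, purpose: str,
                         on_date: Optional[date] = None) -> bool:
    """Default-deny: share only if the member granted a matching, unexpired rule."""
    on_date = on_date or date.today()
    return any(
        r.category == category and r.recipient == recipient and r.purpose == purpose
        and (r.expires is None or on_date <= r.expires)
        for r in rules
    )

# Example: a member consents to sharing food-insecurity data with a food bank
# for care coordination, and nothing else.
member_rules = [ConsentRule("food_insecurity", "community_food_bank", "care_coordination")]
print(is_sharing_permitted(member_rules, "food_insecurity", "community_food_bank", "care_coordination"))   # True
print(is_sharing_permitted(member_rules, "behavioral_health", "community_food_bank", "care_coordination")) # False
```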

We recently covered HSBlox and their use of SDoH standards, along with their blockchain-based consent management system, which enables a similar form of granular consent and the creation of longitudinal records that include SDoH data. This type of approach is likely to become more common as SDoH vendors mature.
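To make the idea concrete, the sketch below shows the general technique behind a tamper-evident, blockchain-style consent ledger: each consent decision is hashed together with the previous entry, so any later alteration of the history is detectable on verification. The ConsentLedger class and its fields are illustrative assumptions on our part, not HSBlox's implementation.

```python
import hashlib
import json
import time

def _hash_entry(entry: dict, prev_hash: str) -> str:
    payload = json.dumps(entry, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()

class ConsentLedger:
    """Append-only ledger: each entry's hash also covers the previous entry's hash."""

    def __init__(self) -> None:
        self.entries = []  # list of (entry, entry_hash) tuples

    def record(self, member_id: str, category: str, recipient: str, granted: bool) -> None:
        prev_hash = self.entries[-1][1] if self.entries else "0" * 64
        entry = {"member": member_id, "category": category,
                 "recipient": recipient, "granted": granted,
                 "timestamp": time.time()}
        self.entries.append((entry, _hash_entry(entry, prev_hash)))

    def verify(self) -> bool:
        """Recompute the chain; returns False if any past entry was altered."""
        prev_hash = "0" * 64
        for entry, stored_hash in self.entries:
            if _hash_entry(entry, prev_hash) != stored_hash:
                return False
            prev_hash = stored_hash
        return True
```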

Balancing Privacy with Concerns for Health Data Poverty

Since the early months of the pandemic, the disproportionate disease burden on low-income communities has been on the radar of public health agencies and HCOs. While privacy advocates raise important concerns about the risks of data sharing, we have also seen a growing number of editorials in medical journals warning that digital health and AI tools could further exacerbate health inequities. One of the important areas is the realm of “data poverty”.

A recent Lancet piece notes systemic differences in the quantity and quality of data representing different groups and populations. For example, as of 2018, participants in genome-wide association studies (GWAS) were 78% European, 10% Asian, 2% African, and 1% Hispanic. AI models used in medical imaging are disproportionately trained on data from California, Massachusetts, and New York, with little data from the other 47 states. The authors conclude that there are implications for those excluded from these datasets; namely, they may not benefit from data-driven tools and could potentially even be harmed.

The authors point to the need for more citizen engagement about data and about how data on individuals and communities can be used to build better AI/ML applications, as well as healthcare delivery systems that meet the needs of different types of communities. Filling in the data gaps for diverse communities will require building trust, both so that these communities understand how their data are used and so that they realize the benefits of sharing data with technologists and health systems as a two-way street.

Figure source: Leslie et al. (BMJ)

COVID and Data Disparities

We witnessed unprecedented levels of data sharing in the early months of the pandemic that helped generate insights on treatments, surveillance, and the wide range of scientific challenges a novel virus imposes on health systems. The COVID-19 Therapeutics Accelerator and the COVID-19 Data Research Alliance are two of the most prominent examples.

Despite these success stories, David Leslie et al. have raised questions about how AI tools used during COVID could increase inequality. Citing the same bias issues described above, they note that investments in representative datasets, independent evaluation and replication of models, and regulator access to data will be some of the steps needed to limit the harm posed by data deficits for some populations.
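One of those steps, independent evaluation of models broken out by population subgroup, can be sketched simply: compare performance and data share for each group so that gaps tied to under-representation become visible. The function and the accuracy metric below are illustrative assumptions, not a prescribed audit method; a real audit would use clinically relevant metrics and cohorts.

```python
from collections import defaultdict

def accuracy_by_group(y_true, y_pred, groups):
    """Return per-subgroup accuracy along with each subgroup's share of the test set."""
    correct, totals = defaultdict(int), defaultdict(int)
    for truth, pred, group in zip(y_true, y_pred, groups):
        totals[group] += 1
        correct[group] += int(truth == pred)
    n = len(groups)
    return {g: {"accuracy": correct[g] / totals[g], "share_of_data": totals[g] / n}
            for g in totals}

# A subgroup that makes up only a sliver of the data will often show weaker
# performance, flagging where more representative data collection is needed.
print(accuracy_by_group([1, 0, 1, 1], [1, 0, 0, 1], ["A", "A", "B", "A"]))
```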

Conclusion

We are still in the early days of SDoH utilization in health systems; however, challenges beyond the usual set of interoperability and standards issues need to be addressed for these programs to succeed long term. In this post we examined some of the privacy concerns and linked them to the growing use of AI in healthcare, including SDoH platforms. A number of companies are taking proactive measures on the privacy front, and these efforts are critical to building trust in communities, which in turn will help address the data poverty issue.

Organizations can begin laying the groundwork for building trust and protecting privacy by setting up an SDoH governance committee that takes a deep dive on these issues and begins the necessary training around topics such as the following:

  • Understand the unique privacy issues associated with SDoH data, and the differences between individual-level and community-level data, in order to raise and socialize the privacy protections and governance structures that need to be in place
  • Develop specific policies on which data can be shared with community organizations, and how
  • Review consent management policies and mechanisms for enabling opt-out and notifications when data are shared
  • Work with EHR teams to identify better ways to capture SDoH data and determine which data are actionable (a minimal example of standards-based capture follows this list)
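For the last item, the sketch below shows what standards-based SDoH capture could look like: a FHIR Observation carrying a screening result, in the spirit of the Gravity Project's coding work. The LOINC code is a placeholder and the structure is a simplified assumption on our part; real implementations should follow the value sets and profiles published by the Gravity Project and US Core.

```python
import json

# Hypothetical, simplified FHIR Observation for a food-insecurity screening item.
# The category code "social-history" is a standard FHIR observation category;
# the LOINC code below is a placeholder for the actual screening-question code.
food_insecurity_observation = {
    "resourceType": "Observation",
    "status": "final",
    "category": [{"coding": [{
        "system": "http://terminology.hl7.org/CodeSystem/observation-category",
        "code": "social-history"}]}],
    "code": {"coding": [{
        "system": "http://loinc.org",
        "code": "XXXXX-X",  # placeholder LOINC code for the screening question
        "display": "Food insecurity screening item"}]},
    "subject": {"reference": "Patient/example"},
    "valueCodeableConcept": {"text": "Positive screen"},
}

print(json.dumps(food_insecurity_observation, indent=2))
```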

SDoH data governance mechanisms are still in development at most organizations, but we are beginning to see useful frameworks emerge from AHIMA, HIMSS, and leading vendors in the SDoH space.
