As healthcare organizations expand their AI capabilities and, by extension, tap into a broader and deeper flow of healthcare data, there’s one question they may want to answer sooner rather than later: Will accessing larger volumes of data lead to more litigation?
That may not be the first question healthcare stakeholders want to entertain as they puzzle through their AI options, but according to a recent commentary by a pair of attorneys, they should definitely be thinking about it.
Writing recently at STAT, Patricia S. Calhoun, a healthcare attorney with a focus on privacy issues, and Patricia M. Carreiro, a data privacy and cybersecurity litigation attorney, argue that “big data has greatly expanded what information could be considered ‘individually identifiable,’ particularly when it is in the hands of data giants that already have so many other individual data points they can combine to identify an individual.”
In other words, they say, because of the information that “Big Data” already has, “it may be almost impossible to find that a data giant could not combine almost any information with other data to identify an individual — meaning it may be practically impossible to say that data can ever fall within HIPAA’s ‘de-identified data’ exception.”
To back up their argument, they cite the Consumer Technology Association’s recently issued standard on the definition and characteristics of AI data in healthcare, which “recognizes that even if all personally identifiable information is removed from a dataset, de-identified data might still be at risk of being re-identified.”
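To make the re-identification concern concrete, here is a minimal illustrative sketch in Python. It is not drawn from the commentary or the CTA standard, and every record and field name in it is hypothetical; it simply shows how a record stripped of direct identifiers can be tied back to a person when a few quasi-identifiers are matched against auxiliary data an organization already holds.

```python
# Illustrative sketch only -- hypothetical data, not from the STAT commentary.
# Demonstrates a simple linkage attack: a "de-identified" health record is
# matched against auxiliary data using quasi-identifiers.

# A record with name, SSN, and other direct identifiers already removed.
deidentified_record = {
    "zip3": "331",          # first three digits of ZIP code
    "birth_year": 1984,
    "sex": "F",
    "diagnosis": "Type 2 diabetes",
}

# Auxiliary data a "data giant" might already hold (marketing lists,
# app sign-ups, public records). Entirely made up for illustration.
auxiliary_data = [
    {"name": "A. Rivera", "zip3": "331", "birth_year": 1984, "sex": "F"},
    {"name": "B. Chen",   "zip3": "606", "birth_year": 1990, "sex": "M"},
    {"name": "C. Osei",   "zip3": "331", "birth_year": 1972, "sex": "F"},
]

# Keep only auxiliary records whose quasi-identifiers match exactly.
quasi_identifiers = ("zip3", "birth_year", "sex")
matches = [
    person for person in auxiliary_data
    if all(person[k] == deidentified_record[k] for k in quasi_identifiers)
]

if len(matches) == 1:
    # A unique match ties the sensitive diagnosis back to a named individual.
    print(f"Re-identified: {matches[0]['name']} -> {deidentified_record['diagnosis']}")
else:
    print(f"{len(matches)} candidates -- not uniquely re-identifiable from these fields alone")
```

In this toy example the single exact match links the sensitive diagnosis back to a named individual, which is precisely the scenario the attorneys warn HIPAA’s de-identified data exception may not protect against once enough outside data points are available to combine.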
Moreover, they say, while the Department of Health and Human Services has provided additional guidance on de-identifying data, “its guidance is not without ambiguity. And even if it were unambiguous, courts could disagree.”
In short, they say, healthcare companies involved in AI-focused partnerships should err on the side of caution and “not rely on HIPAA’s de-identified data exception without considering the risk of litigation. While there is no private right of action under HIPAA (meaning that individuals cannot sue to enforce it), individuals can still use a provider’s noncompliance with a HIPAA standard as a basis for suing a provider for being negligent with the individual’s information.”
For the time being, then, “unless and until HIPAA’s de-identified data exception is amended, entities relying on it for AI partnerships face increased risk of litigation. At this point, the only way to completely avoid this risk is to obtain HIPAA-compliant patient authorization for the disclosure of data.”