Cover Story
Using AI in hospitals: the HIPAA hurdle
The use of AI in hospitals and related healthcare settings is possible, but there are various HIPAA compliance considerations to navigate, writes Ross Law.

Credit: Pix Sell / Shutterstock
As the dimensions of an increasingly digitised, connected world grew clearer in the 1990s, the broad-ranging US Health Insurance Portability and Accountability Act (HIPAA) was signed into law by the Clinton administration in 1996.
The regulation’s primary purpose was to protect healthcare coverage for individuals who lose or change their jobs. To meet HIPAA’s additional requirements around the use and disclosure of individuals’ electronic protected health information (ePHI), the US Department of Health and Human Services (HHS) subsequently drafted the HIPAA Privacy Rule, which became effective in April 2003.
The Privacy Rule was created to ensure that individuals' health information is properly protected, while also allowing the flow of health information needed to “provide and promote high quality health care” and protect the public's health and wellbeing.
While HIPAA’s aims were well suited to a time when those behind the legislation were likely thinking of protecting ePHI transmitted via the arcane media of fax machines and pagers – both largely defunct technologies today – they could not have accounted for the rapid rise of artificial intelligence (AI) and its growing prevalence in healthcare.
In US hospitals and other healthcare settings, AI systems and models have found their way into medical devices such as robotic surgery systems and the CT and MRI scanners used in radiology, where they act as diagnostic support aids. AI chatbots are also driving efficiencies in practitioner notetaking during consultations and are being applied more broadly to hospital management processes such as patient triage and the consolidation of patients’ healthcare records.
To remain compliant, hospitals need to consider carefully how interactions between ePHI and AI can satisfy HIPAA, or how those interactions can be undertaken in such a way that they fall outside HIPAA’s remit altogether. It is worth noting, however, that there is a degree of interpretation around AI and HIPAA, since the regulation does not explicitly mention AI. Some observers have deemed HIPAA an overly mercurial regulation for this reason, while others have advised recognising the regulation’s shortcomings to gain a “better understanding about the challenges related to compliance” in the context of AI.
The challenges around AI deployment in hospitals
Most AI devices or platforms require access to ePHI to be useful, and that’s where hospitals “have to pause”, says Nirav Chheda, co-founder and CEO of non-emergency medical transportation service Bambi NEMT.
“You can’t just drop in a model that pulls ePHI data without proving how that data’s encrypted at rest and in transit, how user access is managed, and how incidents are logged.”
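As a rough illustration of the controls Chheda lists, the Python sketch below (a minimal example with invented field names, not any vendor’s actual stack) encrypts an ePHI record at rest using the open-source cryptography library and writes an audit log entry on every access:

```python
# Minimal sketch: encryption at rest plus access logging for an ePHI record.
import json
import logging
from cryptography.fernet import Fernet  # third-party: pip install cryptography

logging.basicConfig(filename="ephi_access.log", level=logging.INFO)

key = Fernet.generate_key()  # in production this would live in a managed KMS/HSM
cipher = Fernet(key)

# Encrypted at rest: only ciphertext is ever written to storage.
record = {"patient_id": "12345", "diagnosis": "hypertension"}
blob = cipher.encrypt(json.dumps(record).encode())

def read_record(user: str, encrypted: bytes) -> dict:
    """Decrypt a record, leaving the audit trail HIPAA's Security Rule expects."""
    logging.info("ePHI accessed by %s", user)
    return json.loads(cipher.decrypt(encrypted).decode())

print(read_record("dr_smith", blob))
```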
A further complicating factor in using AI in hospitals is integration: many hospitals rely on fragmented legacy infrastructure.
Chheda continues: “An AI system that predicts ER wait times might need data from patient admissions, bed availability, and discharge schedules – data that isn't always in one place or in a usable format.
“That’s where compliance risk grows, especially when data has to be pulled from multiple sources.”
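A hypothetical sketch of that fragmentation, with all table and column names invented for illustration: the pandas snippet below stands in for the start of an ER wait-time feature pipeline, where admissions, bed and discharge data arrive from separate systems under different schemas and must be reconciled before any model can use them.

```python
# Illustrative only: three fragmented sources an ER wait-time model might need.
import pandas as pd

admissions = pd.DataFrame({"patient_id": ["A1", "A2"],
                           "admitted_at": ["2024-01-01 08:00", "2024-01-01 09:30"]})
beds = pd.DataFrame({"unit": ["ER", "ICU"], "free_beds": [3, 1]})
discharges = pd.DataFrame({"PatientID": ["A1"], "discharge_due": ["2024-01-02"]})

# Each system uses its own schema, so fields must be normalised before joining.
discharges = discharges.rename(columns={"PatientID": "patient_id"})
admissions["admitted_at"] = pd.to_datetime(admissions["admitted_at"])

# Every merge is another place ePHI flows, and so another point of compliance risk.
features = admissions.merge(discharges, on="patient_id", how="left")
features["er_free_beds"] = beds.loc[beds["unit"] == "ER", "free_beds"].iat[0]
print(features)
```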

Confidential computing
A core tenet of HIPAA lies in having appropriate security controls for safeguarding patients’ ePHI, but AI models like ChatGPT are not really designed for this, says Peter DeMeo, chief product and revenue officer at cloud provider Phoenix Technologies.
“Even OpenAI currently hasn't entered into any kind of business associate agreements with any healthcare entities, meaning they are effectively not certifying that the patient communications or anything else that goes into their models wouldn't violate the HIPAA regulations.”
In this context, business associate agreements are legally binding contracts between HIPAA-covered entities and their business associates, such as AI model developers, requiring the associate to safeguard any ePHI it handles.
To meet HIPAA compliance, AI models are commonly hosted on cloud-based architecture and secured by confidential computing. Confidential computing can be thought of as a silo in which an AI model, and everything passed to it, is completely walled off and secured.
Phoenix deploys cloud-hosted environments, supported by technology from IBM, to enable ePHI data to be acted upon by AI models or agents while aligning with HIPAA’s data protection rules.
Within these environments, the Swiss company hosts its own conversational AI model. With this setup, nothing gets “saved on the server, nothing is in a log, nothing is going back to anybody”, explains DeMeo.
“And in that regard, users can start to build AI agents and models around HIPAA-protected data, because it is acting on a data source protected by confidential computing.”
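Real confidential computing depends on hardware enclaves (Intel SGX, AMD SEV and the like) with remote attestation, which ordinary code cannot reproduce. The hedged Python sketch below merely mimics the trust boundary DeMeo describes, with all names illustrative: the host only ever handles ciphertext, plaintext exists solely inside the “enclave” function, and only the key holder can read the output.

```python
# Conceptual sketch only: this simulates the confidential-computing trust
# boundary in software; it is not a real hardware enclave.
from cryptography.fernet import Fernet  # pip install cryptography

owner_key = Fernet.generate_key()  # held by the data owner, never by the host
f = Fernet(owner_key)

ciphertext = f.encrypt(b"ePHI: blood pressure 120/80")  # all the host ever sees

def enclave_inference(blob: bytes) -> bytes:
    """Stand-in for an attested enclave: decrypt, run the model, re-encrypt.
    Nothing here is saved on a server, written to a log, or sent to anybody,
    mirroring DeMeo's description."""
    plaintext = f.decrypt(blob)
    result = b"model output for: " + plaintext  # placeholder for the AI model
    return f.encrypt(result)

# Only the key holder can decrypt the result; the provider holds no key at all.
print(f.decrypt(enclave_inference(ciphertext)))
```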
“A lot of the different cloud providers use confidential computing environments, but one main difference is that US-based cloud companies are all subject to the US CLOUD Act of 2018, as well as the Foreign Intelligence Surveillance Act (FISA) court,” DeMeo adds.
These laws mean that large-scale cloud service providers (hyperscalers) such as AWS hold a key that can unlock and decrypt customer data, which they could be compelled to hand over to authorities that request it.
Since Phoenix operates from Switzerland, it is not subject to the US CLOUD Act.
“Where we differ is that, as a provider, we don't have the key,” says DeMeo.
This ensures that companies have complete sovereignty and control over their data at all times – a potentially critical differentiator at a time when the Trump administration appears to be taking steps to actively undermine data privacy protections in the US.
Data de-identification – is it enough?
Another approach to enabling patients’ data to be used by AI models – albeit one insufficient for areas like real-time monitoring – is to de-identify it, thereby putting it outside the scope of HIPAA.
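For a sense of what de-identification involves in practice, the sketch below loosely follows HIPAA’s Safe Harbor method, which requires removing 18 categories of identifiers; the field names are invented, and a production implementation would need to cover all 18 categories.

```python
# Hedged sketch of Safe Harbor de-identification: strip direct identifiers,
# generalise dates to the year, truncate ZIP codes to three digits. A real
# implementation must handle all 18 categories, plus edge cases such as
# ages over 89.
DIRECT_IDENTIFIERS = {"name", "phone", "email", "ssn", "mrn"}

def deidentify(record: dict) -> dict:
    out = {}
    for field, value in record.items():
        if field in DIRECT_IDENTIFIERS:
            continue                       # drop direct identifiers outright
        elif field == "birth_date":
            out["birth_year"] = value[:4]  # keep the year only
        elif field == "zip":
            out["zip3"] = value[:3]        # allowed only if the area holds >20,000 people
        else:
            out[field] = value
    return out

record = {"name": "Jane Doe", "ssn": "000-00-0000", "birth_date": "1980-06-15",
          "zip": "90210", "diagnosis": "type 2 diabetes"}
print(deidentify(record))
# -> {'birth_year': '1980', 'zip3': '902', 'diagnosis': 'type 2 diabetes'}
```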
However, according to Lisa Acevedo, healthcare lawyer at Polsinelli, there is a concern that the power of generative AI (genAI), and the fact that so much data is being pulled from multiple sources, increases the risk that individuals can be re-identified.
“The use of AI tools in their hospital or practice is an immediate concern for clients we interact with, but they're also concerned, more broadly, about manufacturers’ continued possession of data, even if it's considered to be de-identified.”
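One standard way to quantify that re-identification risk (a general privacy technique, not one Acevedo prescribes) is k-anonymity: every combination of quasi-identifiers, such as birth year and partial ZIP code, should be shared by at least k records, since a group of one is trivially re-identifiable. A minimal sketch:

```python
# k-anonymity check over invented records: find the smallest group of rows
# sharing identical quasi-identifier values.
from collections import Counter

def k_anonymity(records: list[dict], quasi_identifiers: list[str]) -> int:
    """Size of the smallest group sharing the same quasi-identifier values."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return min(groups.values())

rows = [{"birth_year": "1980", "zip3": "902", "diagnosis": "flu"},
        {"birth_year": "1980", "zip3": "902", "diagnosis": "asthma"},
        {"birth_year": "1955", "zip3": "331", "diagnosis": "gout"}]
print(k_anonymity(rows, ["birth_year", "zip3"]))  # -> 1: the 1955 record stands alone
```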
Michael Gaba, vice chair of the food and drug practice at Polsinelli, suspects that, given the learning curve everyone is on with AI, law and regulation will continue to “play catch up”.
“From an FDA and other agency perspective, and I suspect even within the Office for Civil Rights at HHS, we try our best to utilise the existing frameworks we have and presume they're going to work,” says Gaba.
“We hold up the new technology to the existing rules, and oftentimes persuade ourselves that it should work.”
Gaba concludes: “But over time, it may become clear that it doesn’t work as well as we thought, and we come to understand better why it doesn't work, and then we try to react to that, perhaps by developing new regulation or new statutory authority to engage in certain behaviours.”
HIPAA feels like a regulation fit for purpose in a different era. There have been several amendments to the regulation since its inception, with the most recent proposal, published in December 2024, intended to account for the increase in cyberattacks on healthcare infrastructure. With RFK Jr now helming HHS, and recent workforce cuts at the department, it remains to be seen whether this revision will come to fruition and, if it does, whether further clarity on AI implementations will follow.