In 2022, we concluded our HIPAA analysis with a forward-looking prediction: "As healthcare becomes increasingly digital, we expect HHS to release additional guidelines helping covered entities adopt new digital technologies — including AI-driven diagnostics and blockchain-based health records — within the HIPAA framework." We got the direction right. We severely underestimated the speed, scale, and disruptive force of what "AI-driven" would actually mean by 2026.
The clinical AI transformation of the past four years has not been incremental. It has been architectural. Large language models now sit inside electronic health record systems, ambient microphones listen to clinical encounters and generate structured documentation in real time, AI tools read radiology images and pathology slides, and foundation models trained on millions of de-identified patient records are embedded in hospital decision support systems across the country. HIPAA's 1996 architecture was not designed for any of this — and regulators, covered entities, and patients are all navigating the gap in real time.
The most clinically transformative — and HIPAA-sensitive — development of the past four years has been ambient clinical intelligence: AI systems that listen to the patient-provider encounter, transcribe it in real time, extract clinically relevant information, and generate a structured note that the physician reviews and signs. In 2022, this technology was nascent. By 2026, it is mainstream clinical infrastructure.
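The loop described above (listen, transcribe, extract, draft, sign) can be sketched in a few lines. This is a toy illustration only: real platforms use speech and language models rather than keyword rules, and the note fields, symptom list, and function names here are our own assumptions, not any vendor's design.

```python
import re
from dataclasses import dataclass, field

# Toy sketch of the ambient-documentation loop: diarized transcript turns
# in, structured draft note out, pending physician sign-off. The keyword
# rules below stand in for what production systems do with ML models.

SYMPTOM_TERMS = {"headache", "nausea", "fatigue", "cough"}
PLAN_VERBS = re.compile(r"\b(order|start|refer|prescribe)\b", re.IGNORECASE)

@dataclass
class DraftNote:
    subjective: list = field(default_factory=list)  # symptoms the patient reported
    plan: list = field(default_factory=list)        # proposed orders/actions
    signed: bool = False                            # physician must review and sign

def draft_from_transcript(turns):
    """turns: list of (speaker, utterance) pairs from the diarized transcript."""
    note = DraftNote()
    for speaker, utterance in turns:
        if speaker == "patient":
            note.subjective.extend(
                term for term in sorted(SYMPTOM_TERMS) if term in utterance.lower()
            )
        elif speaker == "clinician" and PLAN_VERBS.search(utterance):
            note.plan.append(utterance)
    return note
```

Note that every artifact this pipeline touches (the audio, the transcript, the draft note) is PHI, which is why the regulatory obligations discussed below attach to each stage.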
Microsoft's Nuance DAX Copilot, embedded in Epic and Oracle Health (formerly Cerner) EHRs, now processes tens of millions of clinical encounters annually across hundreds of health systems. Abridge, backed by Andreessen Horowitz and UPMC, has deployed across 150+ health systems. Suki AI and DeepScribe serve the ambulatory care market. Ambience Healthcare has focused on specialty care documentation, where note complexity is highest.
These platforms sit squarely within HIPAA's purview. They are Business Associates that sign Business Associate Agreements (BAAs) with covered entities. They process Protected Health Information (PHI) in real time: audio recordings of encounters, their transcriptions, and the clinical notes generated from them. The Privacy Rule's minimum necessary standard, the Security Rule's access controls, and the Breach Notification Rule's reporting obligations all apply to every ambient AI platform operating in U.S. health systems today.
"HIPAA ensures patient privacy is safeguarded in an age of cloud technology and mobile devices while streamlining communication between providers." — The principle holds. The implementation challenge of applying HIPAA to ambient AI is orders of magnitude more complex than applying it to a cloud EHR or a telehealth platform.
If ambient AI represented the optimistic frontier of healthcare's digital transformation, the February 2024 ransomware attack on Change Healthcare — a UnitedHealth Group subsidiary processing approximately 40% of all U.S. medical claims — represented its most catastrophic failure. The breach exposed the Protected Health Information of an estimated 100-190 million Americans — potentially the largest healthcare data breach in U.S. history.
The Change Healthcare incident revealed a critical HIPAA structural weakness our 2022 analysis didn't address: the concentration of PHI processing in third-party clearinghouses that serve as systemic infrastructure for the entire U.S. healthcare economy. Change Healthcare was a Business Associate for virtually every major health system, insurer, and pharmacy in the country. Its breach simultaneously impaired claims processing, prior authorizations, pharmacy operations, and patient billing across the entire U.S. healthcare system for weeks.
HHS's 2024 response was significant: the Office for Civil Rights (OCR) proposed a major update to the HIPAA Security Rule, its first comprehensive overhaul in roughly two decades. The proposal would mandate multi-factor authentication, network segmentation, vulnerability scanning, and incident response planning, going well beyond the original rule's flexible "reasonable and appropriate" standard. The updates are scheduled for finalization in 2026 and will require covered entities and Business Associates to fundamentally rebuild their security programs.
The 2022 HIPAA piece anticipated AI's growing role but couldn't have anticipated the specific challenge that large language models create for the HIPAA framework. The core tension: LLMs are trained on data, and the most valuable data for clinical LLMs is patient data — but training on PHI triggers HIPAA's consent, de-identification, and authorization requirements.
HHS OCR issued guidance in 2024 clarifying that using PHI to train AI models requires either patient authorization or de-identification under HIPAA's established methods: Safe Harbor (removal of 18 enumerated identifier categories) or expert determination. This has created significant legal complexity for health systems that want to train proprietary AI models on their own patient populations. The result has been a bifurcated market:
De-identified data pipelines — companies like Truveta, Aetion, and Komodo Health have built large-scale de-identified clinical data platforms that allow AI training without triggering HIPAA's authorization requirements. These platforms are now multi-hundred-million-dollar businesses supplying the training pipelines of major health AI companies.
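To make the Safe Harbor mechanics concrete: the standard enumerates 18 identifier categories that must be removed, with specific rules such as keeping only the year of any date and only the first three digits of a ZIP code (and only when the corresponding geographic unit exceeds 20,000 people). The sketch below handles just four categories with regexes; production pipelines of the kind these companies run cover all 18 and rely on NLP, not pattern matching.

```python
import re

# Illustrative subset of the 18 Safe Harbor identifier categories.
# Regex-only detection is error-prone (e.g., any 5-digit number looks
# like a ZIP code); shown only to demonstrate the redaction rules.
SSN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")
PHONE = re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b")
FULL_DATE = re.compile(r"\b\d{1,2}/\d{1,2}/(\d{4})\b")  # keep year only
ZIP5 = re.compile(r"\b(\d{3})\d{2}\b")                  # keep first 3 digits

def deidentify(text: str) -> str:
    text = SSN.sub("[SSN]", text)
    text = PHONE.sub("[PHONE]", text)
    text = FULL_DATE.sub(lambda m: m.group(1), text)
    # Safe Harbor also zeroes all 5 digits when the 3-digit ZIP area
    # has fewer than 20,000 residents; omitted here for brevity.
    text = ZIP5.sub(lambda m: m.group(1) + "00", text)
    return text
```

For example, `deidentify("Seen 03/14/1962, ZIP 15213")` returns `"Seen 1962, ZIP 15200"`. The gap between this sketch and a defensible pipeline (names, addresses, medical record numbers, free-text mentions of relatives, and so on) is exactly the product these data platform companies sell.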
Foundation model licensing — rather than training their own models on PHI, many health systems are licensing foundation models (GPT-4, Claude, Gemini) through BAA-covered enterprise agreements, fine-tuning them on de-identified data, and deploying in HIPAA-compliant cloud environments. Microsoft Azure Healthcare, Google Cloud Healthcare API, and AWS HealthLake have each built HIPAA-eligible AI infrastructure specifically for this use case.
Our 2022 analysis did not address one of the most significant HIPAA gaps to emerge in the intervening years: consumer mental health applications. Platforms like BetterHelp, Talkspace, and hundreds of mental health apps operate outside HIPAA's coverage because they are not covered entities and their users are not "patients" in the legal sense — they are consumers entering direct-pay relationships. Yet these platforms collect extraordinarily sensitive mental health information and, in several documented cases, shared it with advertisers.
The FTC has stepped into the gap HIPAA left: BetterHelp paid $7.8 million in FTC settlements in 2023 for sharing user depression data with Facebook and Snapchat. The FTC's Health Breach Notification Rule was expanded in 2024 to cover health apps and wearables not covered by HIPAA. Congress has advanced the Health Data Use and Privacy Commission Act — which would create comprehensive health data privacy standards covering non-HIPAA entities. By 2026, the coverage gap between HIPAA and consumer health data is the most actively legislated area of health privacy law.
The HIPAA compliance and health data privacy market has expanded dramatically since 2022. Ambient AI platforms — Nuance DAX Copilot, Abridge, Suki, Ambience — are growing at 100%+ annually as health systems scramble to address physician burnout through AI documentation. HIPAA compliance infrastructure for AI — de-identification pipelines (Truveta, Komodo Health), BAA-covered cloud environments (Azure, Google Cloud, AWS HealthLake), and healthcare security monitoring (Claroty and its Medigate platform) — represents a large and growing B2B market. The proposed Security Rule updates will require capital investment across every health system in America. Investors in healthcare IT should treat the HIPAA Security Rule update not as a compliance burden but as a multi-year infrastructure spend cycle.