Event Recap – AI & HIPAA: Legal Challenges and Solutions for Medtech

May 15, 2025

On May 8, 2025, Gardner Law attorney Paul Rothermel led a timely webinar examining the intersection of artificial intelligence (AI), HIPAA, and privacy regulation in the medical technology sector. Drawing on deep expertise in privacy, cybersecurity, and healthcare compliance, Paul provided practical insights for companies navigating the legal and regulatory risks tied to AI use, development, and deployment. 

DOWNLOAD THE PRESENTATION SLIDES

The Expanding Role of AI in Healthcare

Paul opened the session by describing the expanding use of AI and machine learning (ML) in healthcare. AI technologies are now integral to diagnostics, treatment planning, clinical research, administrative support, and even automated decision-making (agentic AI). While these tools offer tremendous benefits—such as helping clinicians interpret large data sets and improving operational efficiency—they also raise important privacy, security, and compliance questions under the Health Insurance Portability and Accountability Act (“HIPAA”), the HITECH Act, and other laws.

Managing PHI and HIPAA Compliance

A central theme of the discussion was how protected health information (PHI) is regulated under HIPAA when used in AI applications. Paul explained that PHI handled by covered entities and business associates is subject to strict rules limiting its use and disclosure. For companies looking to train AI models with health data, compliance options may include deidentifying data according to HIPAA standards, obtaining patient authorization, securing institutional review board (“IRB”) or privacy board waivers, or using limited data sets with data use agreements. Without these safeguards, AI projects risk violating HIPAA and triggering enforcement actions. Meanwhile, users of AI systems need to take steps to ensure PHI is not improperly disclosed to AI tools, including generative AI.

Paul emphasized the importance of early compliance planning:

“AI doesn’t exist in a regulatory vacuum,” Paul noted. “If you’re working with health data, it’s critical to understand whether you’re dealing with protected health information, whether you qualify as a covered entity or business associate, and how HIPAA and other privacy laws shape what you can and cannot do. Companies who develop or use AI tools without fully accounting for these legal boundaries may experience major headaches down the road.”

State Privacy Laws: Additional Layers of Complexity

Beyond HIPAA, Paul highlighted the growing complexity of state privacy laws, such as the California Consumer Privacy Act and Washington’s My Health My Data Act. While many state laws exempt HIPAA-regulated data, gaps and overlaps persist—particularly for non-PHI health data or for companies operating outside the traditional HIPAA framework. “It’s not always clear-cut which laws apply,” Paul cautioned, emphasizing the need for tailored legal analysis for each AI project.

Emerging AI Regulations: Spotlight on Colorado’s AI Act

The session also explored the rise of state-level AI regulations, with a focus on Colorado’s Artificial Intelligence Act, set to take effect in 2026. Paul outlined how this law imposes new requirements on developers and deployers of “high-risk AI systems,” including obligations to document training data, mitigate bias, increase transparency, and facilitate impact assessments. While the Act includes exemptions for certain HIPAA- and FDA-regulated activities, Paul recommended that medtech companies stay alert as more jurisdictions consider similar laws. 

Practical Strategies for Compliance and Risk Management

To close the session, Paul provided actionable strategies for mitigating privacy, security, and regulatory risks. Recommendations included establishing strong AI governance frameworks, conducting vendor diligence, embedding AI-specific protections in contracts and business associate agreements, and implementing internal policies and training around AI use and development. Paul stressed the importance of transparency when using AI in healthcare settings—ensuring AI supports, rather than replaces, healthcare provider decision-making—and of aligning any AI system development with the Federal Food, Drug, and Cosmetic Act and FDA regulations.

Key Takeaways for MedTech Innovators

Throughout the webinar, Paul shared valuable insights tailored to medical device manufacturers, software developers, privacy officers, legal counsel, and compliance leaders working at the forefront of AI in healthcare. His key message: as AI technologies evolve, so must organizations’ compliance strategies to ensure innovation aligns with legal, privacy, and regulatory expectations.

If you have questions about AI, HIPAA, or privacy compliance in the healthcare and medtech sectors, or would like to discuss your organization’s legal strategy, contact Paul Rothermel at prothermel@gardner.law.