
California Patients Sue Over AI Medical Recording Without Consent

California patients sue Sutter Health and MemorialCare over AI recording medical visits without consent, raising critical privacy concerns about healthcare technology


Three California patients discovered their most intimate medical conversations were being secretly recorded and transmitted to artificial intelligence systems. Now they're fighting back with lawsuits that could reshape how healthcare uses AI technology.

The plaintiffs – Christina Washington, Dennis Gueretta, and Rebecca Matulic – filed a proposed class action in the U.S. District Court for the Northern District of California against Sutter Health, Memorial Health Services, and MemorialCare Medical Foundation after learning their medical appointments were recorded by Abridge AI's transcription tool without their knowledge or consent.

The case exposes a fundamental tension in modern medicine: healthcare providers desperate to reduce burnout are embracing AI documentation tools, while patients are discovering their most private health discussions are being captured, transmitted, and processed by third-party systems they never agreed to.

The Technology Recording Your Doctor Visits

AI medical scribes like Abridge work by recording entire patient-doctor conversations through microphones on providers' devices. The platform captures the substance of what the patient and physician say, converts the recorded speech into text and applies AI models to generate structured clinical documentation.
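The pipeline described above can be sketched in a few lines. This is a minimal, hypothetical illustration of the record-transcribe-structure flow, not Abridge's actual implementation: the transcription and extraction steps are stubbed with toy logic, and all function and field names are assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class ClinicalNote:
    """Toy stand-in for the structured note an AI scribe produces."""
    symptoms: list = field(default_factory=list)
    medications: list = field(default_factory=list)

def transcribe(audio_chunks):
    # Stand-in for a speech-to-text model; here the "audio" is already text.
    return " ".join(audio_chunks)

def structure_note(transcript):
    # Stand-in for the generative model that extracts structured fields
    # from the raw transcript. Real systems are far more sophisticated.
    note = ClinicalNote()
    for sentence in transcript.split(". "):
        lowered = sentence.lower()
        if "pain" in lowered or "cough" in lowered:
            note.symptoms.append(sentence.strip(". "))
        if "prescrib" in lowered:
            note.medications.append(sentence.strip(". "))
    return note

visit_audio = [
    "Patient reports chest pain for two days.",
    "Prescribing ibuprofen 400 mg.",
]
note = structure_note(transcribe(visit_audio))
```

The privacy issue in the lawsuits arises at the first step: the full conversation, not just the structured output, leaves the exam room and is processed on third-party infrastructure.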

According to a new study from the American Medical Association (AMA), 66 percent of physicians reported using artificial intelligence (AI) tools or systems at work in 2024, representing a 78 percent increase in use from 2023. Among the most quickly adopted AI applications has been medical scribes — AI systems designed to automatically document patient encounters, generate clinical notes, and assist with medical documentation. These tools promise to ease physician stress and burnout by reducing time spent on administrative tasks, while improving documentation quality.

But here's what happens behind the scenes: the recorded conversations and transcripts are "transmitted outside the clinical environment and retained, stored, or otherwise processed on infrastructure associated with the Abridge platform," according to the lawsuit. The intimate health information captured by the tool includes symptoms, medical history, diagnoses, medications, treatment plans, mental health disclosures, and family medical history.

The Legal Battle: More Than Just California Privacy Laws

The California lawsuits allege violations of multiple state and federal laws:

  • California Invasion of Privacy Act (CIPA)
  • California Confidentiality of Medical Information Act (CMIA)
  • California Unfair Competition Law
  • Federal Wiretap Act
  • Invasion of privacy (intrusion upon seclusion)

As KPBS reported, California law requires all parties to consent before sensitive conversations, such as those that take place in a healthcare setting, can be recorded for any purpose.

What makes these cases particularly explosive is a disturbing detail from the Sharp HealthCare lawsuit: according to Saucedo, the notes contain confirmations that patients were advised about the recordings and consented, affirmations that appear to have been added by the AI itself.

The scale is staggering. Attorneys representing the plaintiffs estimate that 100,000 patient encounters have been recorded since the rollout of Abridge. The Abridge AI platform is used by many large health systems and providers, including Johns Hopkins, Mayo Clinic, Mount Sinai Medical Center, UC Health, MemorialCare, Christus Health, Corewell Health, and Reid Health.

The Privacy Minefield of AI Healthcare

The rush to implement AI scribes has created what privacy experts call a perfect storm of risk. When cybercriminals target AI systems, they can potentially access not just stored data but also live patient interactions. The 190 million records exposed in the largest healthcare data breach of 2024 – the Change Healthcare ransomware attack – demonstrate the massive scale of potential exposure. The challenge becomes even greater when considering that AI systems often integrate with multiple healthcare platforms. A breach in one system can cascade across entire healthcare networks, potentially exposing patient data from multiple sources simultaneously.

Beyond security breaches, there's the unsettling question of what happens to all this data. The vast repositories of patient conversations generated by these systems create valuable datasets for AI development and research, yet patients providing clinical information to address specific health problems may not expect their data to be used for algorithm training or commercial AI development. This unconsented secondary use risks eroding patient trust, particularly among communities with historical experiences of medical exploitation. The challenge is compounded when aggregated patient data from AI scribes is used to develop new AI products or sold to third parties, creating economic value from patient interactions without explicit consent for such commercialization.

Healthcare's HIPAA Loophole

Here's where it gets legally murky. Since the information collected, transmitted, and processed by the platform at the direction of its clients is related to healthcare operations, patient consent is not required by HIPAA, provided the healthcare organization has a HIPAA-compliant business associate agreement with Abridge AI. The lawsuit does not allege that HIPAA has been violated but does assert that the interception, recording, and transmission of sensitive communications and health information without patients' express consent violates the federal Wiretap Act and state consumer privacy laws.

This creates a troubling gap. While HIPAA allows healthcare providers to share patient data with business associates for operational purposes without explicit consent, state recording laws often require all parties to consent before conversations can be recorded. Healthcare providers find themselves caught between federal regulations that permit data sharing and state laws that prohibit recording without consent.
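The tension described above can be expressed as a simple decision rule: in "all-party" states every participant must consent before recording; in "one-party" states a single participant's consent suffices. The sketch below is illustrative only; the sample state mapping is hypothetical and not legal advice, and real consent law is far more nuanced.

```python
# Illustrative, hypothetical sample; consent rules vary by state and
# change over time. Do not rely on this mapping for compliance.
ALL_PARTY_CONSENT_STATES = {"CA", "FL", "PA", "WA"}

def may_record(state: str, consenting: set, parties: set) -> bool:
    """Simplified recording-consent check.

    All-party states require consent from every party to the
    conversation; one-party states require consent from at least one.
    """
    if state in ALL_PARTY_CONSENT_STATES:
        return parties <= consenting  # every party has consented
    return len(consenting & parties) >= 1  # any one party suffices

# In an all-party state, the provider's consent alone is not enough:
permitted = may_record("CA", {"provider"}, {"provider", "patient"})
```

Under this simplified rule, a California visit recorded with only the provider's consent fails the check, which is exactly the gap the plaintiffs allege: HIPAA's business-associate pathway never asks the question that CIPA does.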

What Privacy Advocates Are Saying

Privacy experts are sounding alarms about the rapid adoption of AI scribes without adequate safeguards. "Data governance is paramount," said attorney Lee Kim, founder of consulting firm Keytera and former longtime cybersecurity and privacy principal at the Healthcare Information and Management Systems Society. "Entities need to understand what data is being captured, whether it is necessary, how it is processed, who can access it and how long it is retained, including for the organization and vendors and downstream entities," she said.

"From a privacy standpoint, a notice needs to be posted and patients must be given the ability to opt out of having their conversation transcribed," said regulatory attorney Rachel Rose. "If patients are recorded without their knowledge or consent, then it is potentially both a HIPAA and a state law privacy violation. Additionally, because the conversation morphs into a written record, the Security Rule would also apply," she said.

The Canadian experience offers a cautionary tale. On December 17, 2024, an Ontario hospital notified the Privacy Commissioner of Ontario of a breach that occurred when a former hospital physician's AI scribe joined and recorded a virtual hepatology rounds meeting. The tool transcribes spoken words into text and can integrate with virtual meeting platforms by accessing invitations stored in users' digital calendars, which is how the scribe joined the meeting even though the physician was no longer employed at the hospital. The personal health information of seven patients was discussed during the meeting and captured in the recorded transcript, including patient names, sex, physician names, diagnoses, medical notes, and treatment information.

The Human Cost of Innovation

Beyond legal concerns, there are profound questions about the doctor-patient relationship. Because AI medical scribe notes are based on audio recordings of patient-physician conversations, there is a built-in potential for breaches of patient privacy. These conversations may include mentions or confirmations of HIPAA-protected patient information, such as patient name, birthdate, address, contact information, family member names and contact information, pre-existing conditions, or other protected health information (PHI).

The swift adoption of AI medical scribes also raises concerns about accuracy and the lack of independent oversight. Studies indicate that AI scribes may miss crucial non-verbal cues in conversations, leading to incomplete or inaccurate medical records. AI scribes, especially those built on generative models, can "hallucinate" or fabricate clinical information. Worse, they can misattribute information to the wrong patient if transcription errors or patient mismatches occur.

Your Rights as a Patient

So what can patients do? First, understand that you have rights:

  1. Ask explicitly whether AI recording tools will be used during your visit
  2. Request to opt out – how patients can opt out of the AI scribe tool during their visits should be clearly explained
  3. Document your preferences in writing with your healthcare provider
  4. Inquire about data handling – where recordings go, how long they're kept, who has access
  5. Know your state laws – many states regulate the recording of private conversations. These laws vary; some require the consent of all parties, while others require only one party's consent. Because violations can carry civil or criminal penalties, providers should ensure they meet applicable state consent requirements before recording any visit

The Path Forward

The California lawsuits represent just the beginning of what could become a nationwide reckoning over AI in healthcare. According to the complaint, "Defendants implemented the AI recording system without obtaining meaningful, informed consent from patients prior to recording and transmitting their medical conversations." California is becoming the clearest test case for how fast-moving healthcare AI collides with older privacy law. The issue is that its newer AI laws focus more on disclosure and deceptive representation than on exam-room audio capture itself, so lawsuits are leaning on older laws like the California Invasion of Privacy Act and the California Confidentiality of Medical Information Act to fill the gap.

For healthcare providers, the message is clear: As AI scribes become more common in clinical care, providers must ensure compliance with HIPAA and relevant state laws. A valid business associate agreement, a proper security risk analysis, and clear, documented patient consent are essential.

Patient comfort and trust must remain central when using this new technology. As regulatory measures evolve across jurisdictions, providers need to stay abreast of practices and procedures to safeguard patient data and maintain accountability.

The technology companies building these systems also face scrutiny. AI scribes are transforming clinical documentation. However, with great automation comes great accountability, especially under HIPAA. Vendors, digital health companies, and health systems must treat AI scribes not just as software, but as data stewards embedded into patient care. By building strong contractual safeguards, limiting use of PHI to what is permitted under HIPAA, and continuously assessing downstream risks, digital health leaders can embrace innovation without inviting unwarranted risk.

As more healthcare systems adopt AI documentation tools, the fundamental question remains: Can we harness technology to reduce physician burnout without sacrificing the privacy and trust that form the foundation of medical care? The answer will shape not just legal precedent, but the future of the doctor-patient relationship itself.

Patients discovering their medical conversations were recorded without consent aren't just filing lawsuits – they're demanding accountability in an era where the line between healthcare innovation and privacy invasion has never been thinner. The outcome of these cases could determine whether AI enhances healthcare or erodes its most sacred principle: trust between doctor and patient.