
In the high-stakes, caffeine-fueled world of medical dramas, we usually watch for the “miracle saves,” conflicts, and romance among hospital staff. The Pitt, a series streaming on HBO Max, has been praised for its realistic depiction of a chaotic emergency room at a major urban trauma center, with more conflict and trauma than romance.
But for those of us navigating the complexities of healthcare compliance, Season 2 of The Pitt also offers realistic portrayals of the challenges of HIPAA compliance.
Set in the fictional, underfunded Pittsburgh Trauma Medical Center (PTMC), the show’s second season isn’t just about treating trauma; it examines the modern challenges of healthcare costs for patients and how staff must navigate federal regulations in a digital age, from the risks of artificial intelligence to the resilience required during a cyberattack.
Here’s what the show gets right (and wrong) about HIPAA in three Season 2 episodes.
Note: A few spoilers ahead, through Episode 7.
AI Hallucinations and Patient Safety (Episode 2)
In Episode 2, PTMC tackles a common challenge for doctors: the long hours needed to finish a day’s work. This includes the unpaid time clinicians spend at home completing patient charts. Dr. Baran Al-Hashimi presents an ambient AI medical scribing tool. The innovation is groundbreaking—the AI “listens” to the doctor-patient conversation, filters out small talk, and creates a professional medical note directly in the Electronic Health Record (EHR) system.
The Drama in the Pit
The tension rises when a resident physician reviews a chart generated by the tool and notices a serious error: the AI suggested a medication that was never discussed and was actually contraindicated for the patient’s condition. This wasn’t just a typo; it was a “hallucination”—where an AI model confidently produces incorrect or fabricated information.
What HIPAA Says
Under the HIPAA Security Rule, preserving the integrity of Protected Health Information (PHI) is essential. Integrity means ensuring that PHI is not improperly altered or destroyed, so that records remain accurate and trustworthy. When an AI tool “hallucinates” a medication, it generates a record that is inherently inaccurate.
In clinical settings, these hallucinations aren’t just technical glitches; they pose safety risks that could lead to significant HIPAA and malpractice liabilities. If a provider signs off on an AI-generated note without catching the error, they are effectively certifying that the information is accurate.
Managing the Risk
As we discussed in our previous look at AI in healthcare, there isn’t a clear “right or wrong” when it comes to AI use. The “wrong” is in blind adoption.
To stay compliant, providers must:
Maintain Oversight: HIPAA’s integrity requirements effectively demand a “human in the loop.” Clinicians must verify each AI-generated note for accuracy before signing it.
Establish Policy: Organizations should have clear guidelines on when and how to use AI scribes.
Execute a BAA: Any AI vendor handling PHI must sign a Business Associate Agreement (BAA), legally binding them to HIPAA’s privacy and security standards.
Parkour, Privacy, and the Power of Consent (Episode 4)
Episode 4 presents a very modern dilemma. A patient arrives with injuries from performing a parkour stunt for social media. His friend, the videographer, follows him into the treatment room with a camera rolling, determined to capture the “drama” of the ER for their followers.
The Clash
Dr. Michael “Robby” Robinavitch immediately shuts it down, telling the friend, “Whoa, you can’t film in here, we’ve got patient privacy laws.”
What’s Wrong: HIPAA is more flexible than many realize. If the patient permits the filming (which is plausible here, since the pair were recording the parkour stunt together), a signed authorization isn’t strictly required. Under the HIPAA Privacy Rule (45 CFR § 164.510(b)), a provider may disclose PHI to a friend if the patient is present and does not object, or if the patient is incapacitated and the provider reasonably determines that the patient would not object.
What’s Right: The doctor is erring on the side of caution. He isn’t sure whether the patient agrees to being filmed, even though his friend is with him. Also, in a busy ER, a camera might record more than just one patient; it could also capture other patients’ faces, monitors, and charts in the background. Protecting everyone’s privacy in the ER is the hospital’s responsibility.
The Verdict
Who is right? Most compliance officers would back Dr. Robby. While the patient might want the content, the doctor’s priority is the integrity of the clinical environment. By waiting until the patient can speak for himself and ensuring that no other patient’s privacy is compromised, Dr. Robby follows the best practice of “minimum necessary” disclosure.
The “Analog” Nightmare: Cyberattacks (Episode 7)
The tension escalates in Episode 7. The staff learns that Westbridge Hospital, a nearby facility, has experienced a major cyberattack, forcing its systems offline and diverting patients to nearby facilities. The surge of diverted patients quickly overwhelms PTMC.
The Digital Blackout
The CEO arrives in the ER with a warning: PTMC is probably the next target. He also mentions a possible ransom that Westbridge might pay to expedite its recovery from the attack. He advises that, as a precaution, the IT team plans to shut down PTMC’s systems before another attack occurs. He lists “patient registration, electronic health records, lab and radiology interface, email, and Internet.” While he’s speaking, all the screens suddenly go dark.
What HIPAA Says
This episode perfectly illustrates the HIPAA Security Rule’s Administrative Safeguards, specifically Contingency Planning (45 CFR § 164.308(a)(7)). HIPAA requires every covered entity to have:
A Data Backup Plan: To ensure PHI isn’t lost forever.
A Disaster Recovery Plan: To restore systems after an attack.
An Emergency Mode Operation Plan: This is what we see on screen—how the hospital continues to function when the “lights” go out.
Quick Thinking and the Risk
In a moment of quick thinking, a resident snaps a photo of the “Board” (the current list of patients and their statuses) just before the screens go black.
While this was a smart move to help keep the ER operational, it introduces a significant HIPAA risk. The photo now contains unencrypted PHI stored on a personal mobile device. If the phone is lost or if the photo is automatically backed up to a personal cloud, PTMC could face a data breach. It highlights the vulnerability of the healthcare system: even well-intentioned “workarounds” during a crisis can lead to non-compliance.
The Pitt and HIPAA Lessons
Season 2 of The Pitt succeeds because it doesn’t treat HIPAA as a boring administrative hurdle or a purely theoretical issue. Instead, it shows HIPAA as the core of patient trust and safety, embedded in real-world situations.
Whether it’s the high-tech risk of an AI hallucination, the modern nuisance of “filming for the ‘gram,” or the existential threat of a ransomware attack, the show reflects the real-world challenges healthcare providers face every day.
The Takeaway for Real-World Providers
- AI is a tool, not a doctor. Always verify AI-generated notes to prevent “hallucinations” from becoming malpractice.
- Consent is contextual. Respect patient autonomy, but also safeguard the privacy of the “background” patients who didn’t agree to be in someone else’s video.
- Prepare for the “Analog” day. Have a contingency plan that guides staff on how to manage without electronic communications and doesn’t rely on staffers taking photos of hospital data with their personal phones.
The Pitt reminds us that in the ER, every second counts — but so does every piece of protected health information belonging to patients in the hospital’s care.

