Healthcare workers prioritize helping people over information security (disaster ensues)

In Workarounds to Computer Access in Healthcare Organizations: You Want My Password or a Dead Patient?, security researchers from Penn, Dartmouth and USC conducted an excellent piece of ethnographic research on health workers, shadowing them as they moved through their work environments, blithely ignoring, circumventing and sabotaging the information security measures imposed by their IT departments, because in so doing, they were saving lives.

For example, passwords were so commonly written on sticky notes and placed on terminals that they formed "stalactites," and in some hospitals, all workers shared a single password, which was written on a piece of tape stuck to the device -- to solve this, one vendor offers stickers "to write your username and password and post on your computer monitor."

Health workers were also skilled at defeating the proximity sensors that logged them out of their terminals when they got up from their workstations (these automated logouts are vital to ensuring that clinicians check that the record they're looking at is for the patient they're treating, preventing potentially fatal mixups): for example, by placing styrofoam cups over the sensors or by assigning the most junior staffer to press the spacebar at timed intervals.
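The idle-timeout mechanism those workarounds defeat is simple to model. Here's a minimal sketch (all names and timeout values are invented for illustration, not taken from the paper or any real EMR) showing why a periodic spacebar tap keeps a session alive forever:

```python
import time

class SessionTimer:
    """Toy model of an EMR terminal's inactivity auto-logout.
    Illustrative only -- invented names and values."""

    def __init__(self, timeout_s=300.0):
        self.timeout_s = timeout_s
        self.last_activity = time.monotonic()
        self.logged_in = True

    def activity(self):
        # Any input (e.g. a junior staffer tapping the spacebar on a
        # schedule) resets the idle clock -- exactly the workaround
        # the researchers observed.
        self.last_activity = time.monotonic()

    def tick(self):
        # Called periodically by the terminal: log out once the idle
        # timeout elapses with no activity. Returns login state.
        if self.logged_in and time.monotonic() - self.last_activity > self.timeout_s:
            self.logged_in = False
        return self.logged_in
```

As long as `activity()` fires more often than `timeout_s`, `tick()` never logs the user out -- which is why the countermeasure has to address the clinicians' workflow, not just the timer.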

These workarounds were driven by clinicians' need to get their jobs done and by IT's failure to understand what that entailed. For example, IT's imposition of password rotation schedules meant that no one knew what their passwords were from moment to moment, forcing them to write them down and share them (in some cases, the policy may have been imposed on IT by vendors, regulators or insurers). Aggressive timeouts on terminals meant that clinicians spent an undue amount of time logging in, making it impossible to get their work done.

Other IT-based checks forced even-more-dangerous workarounds, like the system that wouldn't let doctors save work without ordering potentially lethal blood thinners, which they'd have to remember to log back in and cancel, or kill their patients. A thumbprint-based signing system for death certificates only accepted thumbprints from one doctor, meaning that his signature was on every death certificate, regardless of whose patient the deceased had been.

Some of these problems may be insurmountable, but the point of these anecdotes is to show that bad solutions can make things worse -- and what's more, leave people with the assumption that the job is done, and even cause bad anti-solutions to propagate to other places (after all, if it's "working" in one place, it'll work somewhere else).

IT departments often have abstinence-based security systems: "Don't do that," or, even worse, "You only think you need to do that." The lesson of this paper is that abstinence-based security only drives the undesirable conduct underground, where it is even more dangerous and uncontrolled.

This has been the season of ransomware attacks on hospitals, with whole hospitals being held to ransom by anonymous creeps demanding millions. Securing those hospitals is job one, but it won't be accomplished by telling the people saving lives that they're wrong to want to get their jobs done -- only by collaborating with and listening to the people on the front lines can IT enlist their support. This is exactly the kind of research we need to start fixing the dumpster-fire that is IT security.

As in other domains, clinicians would also create shadow systems operating in parallel to the health IT. Doctors have “shadow notes.” Nurses have the “nurse’s brain”: a single page with all one’s tasks for all of one’s patients. “You’d be lost without it,” e.g., “at 2:00 I need to do this, later I need to do this, mother is nasty so don’t answer phones from her, etc.” “Occasionally, information in the brain is not information you want in the formal record.” Nurses are told to discard paper notes not in the electronic system. A dental hygienist enthusiastically reported keeping a shadow dental record when the computer system did not allow for the desired level of precision.

At one hospital, nurses in pre-op need to physically move patients to the OR, which is 2 minutes away. It's important to the OR staff that the time of the transfer into the OR is accurately recorded (to the minute). But the patient record (and the EMR portal) is at pre-op, not in the doorway to the OR. When the hospital had a paper-based record, the nurses would enter “current time + 2 min” into the paper record before rolling the patient down the hall. However, the new EMR does not allow future times; consequently, the nurses leave themselves logged in but turn the monitor off -- and then come back to pre-op afterward and record the OR transfer time.

Workarounds to Computer Access in Healthcare Organizations: You Want My Password or a Dead Patient? [Koppel, Smith, Blythe and Kothari/Pubmed]

Study finds Password Misuse in Hospitals a Steaming Hot Mess [Paul/Security Ledger]

(Image: A central computer system monitors the heart rates of each patient in the Intensive Care Unit (ICU), US Navy, PD)
