Sunday, December 08, 2013

On Hypervigilance Due to Bad Health IT: "Texting While Doctoring: A Patient Safety Hazard"

An opinion piece, "Texting While Doctoring: A Patient Safety Hazard," by Christine A. Sinsky, MD, and John W. Beasley, MD, appeared in the Annals of Internal Medicine.  Dr. Sinsky is known to me as what some would call a "healthcare IT iconoclast" (more accurately represented by the term "healthcare IT gadfly/realist," IMO).

In the piece the authors comment on the distractions caused by the technology, leading to doctors missing important cues in the exam room and to impaired problem-solving.  This is part of a larger phenomenon that has been called "skill-degrading" or "de-skilling"; e.g., see my April 16, 2010 post "Health Information Technology Basics From Calif. Nurses Association and National Nurses Organizing Committee" at

The authors point out that these effects are likely to worsen further as more and more clerical tasks, such as order entry, are shifted onto medical professionals.  To new readers: note that computerized order entry is often a complex and convoluted process; CPOE systems are most decidedly NOT mere "typewriters for orders."  See, for instance, part 6 of my series on "Mission-hostile health IT" at

Most of the Annals article is available as a free preview as of this writing and is worth reviewing.

Article preview, click to enlarge

I found one passage in particular striking, though.  In my ongoing discussions with computer scientist/informaticist/polymath Dr. Jon Patrick at U. Sydney, the issue of hypervigilance necessitated by bad health IT came up, and we arrived at the definition seen at my teaching site at

Bad Health IT ("BHIT") is defined as IT that is ill-suited to purpose, hard to use, unreliable, loses data or provides incorrect data, is difficult and/or prohibitively expensive to customize to the needs of different medical specialists and subspecialists, causes cognitive overload, slows rather than facilitates users, lacks appropriate alerts, creates the need for hypervigilance (i.e., towards avoiding IT-related mishaps) that increases stress, is lacking in security, compromises patient privacy or otherwise demonstrates suboptimal design and/or implementation. 

Note this passage in Dr. Sinsky and Beasley's opinion piece:

"I am always multitasking ... I am entering orders, checking labs, downloading information while I talk to the patient.  It requires chronic hypervigilance, which is exhausting and demands conscious effort to stay in the 'present' with the patient" (Day S., Personal communication.)  Click to enlarge.

I don't know whether Dr. Day had seen my materials, but I suspect that this exhausting hypervigilance is all too common, just not much publicized, owing to the secretive, closed, retaliatory-towards-whistleblowers nature of the healthcare IT sector.

I ask: is this what we really want, in pursuit of some uncertain cybernetic miracle?

I note that the healthcare IT experiment (and the technology is experimental), long since usurped from the Medical Informatics pioneers who trained me and put in the hands of commercial interests and of those with a mercantile/manufacturing/management computing background, is increasingly a failure.

-- SS


Anonymous said...


The most surprising aspect of the Annals editorial piece was not the issues or facts contained within. Unless you belong to the HIT troika, consisting of corrupt politicians, crony capitalists, and their Trotskyite cheerleaders (i.e., the Emanuels of the medical establishment), there was no news to report. We already know that the current HIT implementation, especially CPOE, is garbage.

What got my attention was its being published in an ACP journal in the first place! Another pitiful organization that focuses on the most sublime aspects of medicine while maintaining an impotent silence toward the evisceration of the profession by outside forces (see above), if not outright complicity.

Anonymous said...

In my experience, the care that is run by these systems cannot be trusted.

My vigilance is 24/7, because weird, dangerous stuff happens to my patients that never, ever happened prior to the deployment and activation of CPOE systems.

These systems are devices that must be regulated by the FDA, since they meet all definitions of a device in the FD&C Act.

I applaud Chris Sinsky et al. for this astute simile.