
Monday, January 15, 2018

"Hot Spot For User Entry Error": Hawaii missile alert: How one employee ‘pushed the wrong button’ and caused a wave of panic

A short post.

I believe this WaPo story vividly demonstrates the issues I've seen in what my Australian colleague Dr. Jon Patrick & I call "bad health IT."

We came up with the simple-to-comprehend terminology "Good health IT/Bad health IT" in his living room in Sydney, after my 2012 presentation to the Health Informatics Society of Australia on health IT trust (http://hcrenewal.blogspot.com/2012/08/my-presentation-to-health-informatics.html), to replace my earlier terms "health IT done well" vs. "health IT done poorly."

It was not just "one employee who pushed the wrong button."  A team of apparently incompetent IT personnel and utterly incompetent IT managers - completely devoid of any understanding of human-computer interaction - were, in essence, standing behind this employee and guiding his hand.

The Hawaii mishap vividly demonstrates bad IT in the most critical of settings: badly conceived, designed & implemented, lacking appropriate safeguards, usually built by people who do not know the domain and who, dare I say, often lack common sense.

Questions that the incident raises include:
  • How [in God's name] were such critical items as "Test missile alert" and "Missile alert" (the real thing) residing in the same menu?  Who came up with such bad, terse labeling as well?  [A "hot spot" for user entry error] 
  • Why were there no reasonable safeguards? 
  • Why was it easy for anyone to make a big mistake?
  • Why was no system in place for rapid retraction?

The answers translate back to - well, I probably don't need to say it.
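To make the point concrete, here is a minimal sketch of the guard rails the questions above imply: clearly distinguished test and live actions, and an explicit typed confirmation for the live path. The names and the confirmation scheme are my own illustration, not the actual Hawaii system:

from enum import Enum

class AlertMode(Enum):
    TEST = "TEST"   # internal drill; routed to a drill-only channel
    LIVE = "LIVE"   # the real thing; routed to the public channel

def send_missile_alert(mode: AlertMode, operator_id: str, confirmation: str) -> None:
    # A live alert demands a deliberate, typed confirmation, so a single
    # errant click in a drop-down menu cannot by itself alert the public.
    if mode is AlertMode.LIVE and confirmation != "LIVE":
        raise PermissionError("Live alert blocked: confirmation phrase not entered")
    channel = "drill-only channel" if mode is AlertMode.TEST else "public emergency channel"
    print(f"[{mode.value}] alert dispatched by {operator_id} via {channel}")

Even this trivial separation would have forced the operator to type "LIVE" before any real alert went out, rather than letting one mis-click in a menu do the job.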




Hawaii missile alert: How one employee ‘pushed the wrong button’ and caused a wave of panic
Washington Post

... Around 8:05 a.m., the Hawaii emergency employee initiated the internal test, according to a timeline released by the state. From a drop-down menu on a computer program, he saw two options: "Test missile alert" and "Missile alert."

This is a classic example of what can be called a "hot spot for user entry error."

... He was supposed to choose the former; as much of the world now knows, he chose the latter, an initiation of a real-life missile alert.

... "Based on the information we have collected so far, it appears that the government of Hawaii did not have reasonable safeguards or process controls in place to prevent the transmission of a false alert," Pai said in a statement.

... Part of what worsened the situation Saturday was that there was no system in place at the state emergency agency for correcting the error, Rapoza said...."In the past there was no cancellation button. There was no false alarm button at all,"

.... "Part of the problem was it was too easy - for anyone - to make such a big mistake," Rapoza said. "We have to make sure that we're not looking for retribution, but we should be fixing the problems in the system.

It would not be unreasonable to predict that U.S. armed forces were put on alert, perhaps even scrambling fighter planes near the Korean peninsula - moves that other countries could detect.

This "mishap" could have caused N. Korea or other hostile country to react, and led to catastrophe.




The utterly incompetent IT personnel and their utterly incompetent managers who birthed such a cornucopia of IT atrocities should be severely punished.

-- SS

Jan. 16, 2018.  Update, update.  Who's got the button?

This new WaPo story shows the "unholy" menu - a jumbled mess.


https://www.washingtonpost.com/news/morning-mix/wp/2018/01/16/that-was-no-wrong-button-in-hawaii-take-a-look/?utm_term=.147cc4852c8e


Was this designed by an expert in human-computer interaction?  I think not...


Incredible.

-- SS


Wednesday, December 19, 2012

A Significant Additional Observation on the PA Patient Safety Authority Report "The Role of the Electronic Health Record in Patient Safety Events" -- Risk

In a Dec. 13, 2012 post, "Pennsylvania Patient Safety Authority: The Role of the Electronic Health Record in Patient Safety Events," I alluded to risk in a comment in red italics:

... Reported events were categorized by their reporter-selected harm score (see Table 1). Of the 3,099 EHR-related events, 2,763 (89%) were reported as “event, no harm” (e.g., an error did occur but there was no adverse outcome for the patient) [a risk best avoided to start with, because luck runs out eventually - ed.], and 320 (10%) were reported as “unsafe conditions,” which did not result in a harmful event. 

The focus of the report is on how the "events" did not cause harm.  Thus the relatively mild caveat:

"Although the vast majority of EHR-related reports did not document actual harm to the patient, analysts believe that further study of EHR-related near misses and close calls is warranted as a proactive measure."

It occurs to me that if the title of the paper had been "The Role of the Electronic Health Record in Patient Safety Risk," the results might have been interpreted far differently:

In essence, from June 2, 2004, through May 18, 2012 (the timeframe of the Pennsylvania Patient Safety Reporting System, or PA-PSRS, database), and drawing on a dataset highly limited in its comprehensiveness as noted in the earlier post, there were approximately 3,000 "events" in which an error occurred that potentially put patients at risk.
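As a quick check of the quoted figures (a sketch; the residual count is simply what the two no-harm categories leave over, not a breakdown the excerpt itself provides):

total   = 3099   # EHR-related events in PA-PSRS, June 2, 2004 - May 18, 2012
no_harm = 2763   # reported as "event, no harm"
unsafe  = 320    # reported as "unsafe conditions"

print(f"no harm: {no_harm / total:.1%}")               # 89.2%, the reported 89%
print(f"unsafe:  {unsafe / total:.1%}")                # 10.3%, the reported 10%
print(f"residual reports: {total - no_harm - unsafe}")   # 16 reports, presumably the harm categories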

That view - risk - was not the focus of the study.  Should it have been?

These "events" really should be called "risk events."

It is likely that the tally of risk events would be much higher if the database were more comprehensive (due to better recognition of HIT-related problems, better reporting, etc.). So would the reports of "harm and death" events.

That patient harm did not occur in the majority of these "risk events" was due to human intervention, which is to say, in large part, luck.

Luck runs out, eventually.

I have personally saved a relative several times from computer-related "risk events" that could have caused harm had I not been present, with my own medical knowledge, to intervene.  My presence was happenstance in several instances; a traffic jam or a phone call could have kept me away.

What's worse, the report notes:

Analysts noted that EHR-related reports are increasing over time, which was to be expected as adoption of EHRs is growing in the United States overall.

In other words, with the current national frenzy to implement healthcare information technology, the counts of these "risk events" - and of "harm and death" events - will increase.  My concern is that they will increase significantly.

I note that health IT is likely the only mission-critical technology that receives special accommodation regarding risk events.  "If the events didn't cause harm, then they're not that important an issue" seems to be the national attitude overall.

Imagine aircraft whose avionics and controls periodically malfunction, freeze, or provide wrong results, but where most failures are caught by hypervigilant pilots, so planes don't go careening out of control and crash.  Imagine nuclear plants where the same occurs, but hypervigilant operators prevent a meltdown.

Then imagine reports of these "risk events" - based on fragmentary reporting by pilots and plant operators reluctant to report for fear of job retaliation - in which the fact of their occurrence takes a back seat to the observation that the planes did not crash, or that Three Mile Island or Chernobyl did not recur.

That, in fact, seems to be the culture of health IT.

I submit that the major focus that needs addressing in health IT is risk - not just confirmed body counts.

-- SS

Saturday, September 01, 2012

Two recent interesting settlements at Massachusetts General Hospital (MGH), both involving technology


The first case involved a medication error (stemming from a 'miscommunication between doctors and nurses,' an infusion pump snafu, and a failure to perform obvious follow-up labs; it would not surprise me if health IT were involved).  The second case involved alarm fatigue.

These settlement amounts are interesting considering the ages and conditions of the patients.


1.  http://www.lubinandmeyer.com/cases/medication-error.html


Medication Error Lawsuit against MGH Settles for $1.25 Million

The plaintiff’s decedent was a 76-year-old woman who died on 11/24/10 from a hemorrhage. Her death occurred following a preventable medication error involving the drug Lepirudin. The patient was given over 30 times too much medication which resulted in uncontrollable internal bleeding and her subsequent death.

Her past medical history included cirrhosis with well preserved hepatocellular synthetic function. She also had Type 2 diabetes, hypertension and hypercholesterolemia, and a history of splenectomy for treatment of severe thrombocytopenia.


and


2.  http://www.masslive.com/news/index.ssf/2011/11/mass_general_hospital_alarm_fa.html


Mass. General Hospital 'alarm fatigue' lawsuit settled for $850,000

BOSTON (AP) — The family of an 89-year-old man who died at Massachusetts General Hospital when nurses did not respond to alarms on his cardiac monitor has settled its case against the hospital for $850,000.

I see potential lessons for at least two healthcare stakeholders in these cases:

Hospital executives:  bad technology is not your friend.  Get it right before rolling it out, with robust, validated safeguards, to save lives - and to save your organizations from costly litigation and reputational damage.

Clinicians:  bad technology is your enemy.  While hyper-vigilance is mentally exhausting, that's what's required to avoid the fate of the patients - and the clinicians - in the above cases.

Reporting bad technology and making sure the problems are remediated promptly, not glossed over, is equally essential.
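As one concrete example of the kind of validated safeguard meant here, a minimal sketch of a hard-limit dose check of the sort an infusion pump or order-entry system can enforce.  The drug-library numbers below are placeholders for illustration, not clinical values:

# Placeholder hard limits per administration; a real system would draw these
# from a pharmacist-vetted drug library, not from constants in the code.
DOSE_LIMITS_MG = {
    "lepirudin": (0.1, 50.0),   # hypothetical (min, max) in mg -- NOT clinical guidance
}

def check_dose(drug: str, dose_mg: float) -> None:
    limits = DOSE_LIMITS_MG.get(drug.lower())
    if limits is None:
        raise ValueError(f"{drug}: no library entry; require manual pharmacist review")
    low, high = limits
    if not (low <= dose_mg <= high):
        raise ValueError(f"{drug}: {dose_mg} mg is outside hard limits {low}-{high} mg")

A 30-fold overdose of the kind in the first case falls far outside any sane hard limit and would be refused at the point of entry, forcing a human to reconsider before the drug ever reached the patient.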

Note: my interpretation is that both technology and people issues probably played a role in both these accidents, based on my own knowledge and experience, but that is of course a personal opinion. 

-- SS