I believe this WaPo story vividly demonstrates issues I've seen in what Australian colleague Dr. Jon Patrick & I call "bad health IT."
We coined the simple-to-comprehend terminology "good health IT/bad health IT" in his living room in Sydney, after my 2012 presentation on health IT trust to the Health Informatics Society of Australia (http://hcrenewal.blogspot.com/2012/08/my-presentation-to-health-informatics.html). It replaced my earlier terms "health IT done well" vs. "health IT done poorly."
It was not just "one employee who pushed the wrong button." A team of apparently incompetent IT personnel and utterly incompetent IT managers - completely devoid of any understanding of human-computer interaction - was, in essence, standing behind this employee and guiding his hand.
The Hawaii mishap vividly demonstrates bad IT in the most critical of settings - badly conceived, designed & implemented, lacking appropriate safeguards, usually built by people who do not know the domain and who often, dare I say, lack common sense.
- How [in God's name] were such critical items as "Test missile alert" and "Missile alert" (the real thing) residing in the same menu? Who came up with such bad, terse labeling as well? [A "hot spot" for user entry error]
- Why were there no reasonable safeguards?
- Why was it easy for anyone to make a big mistake?
- Why was no system in place for rapid retraction?
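The first two failures above are design problems with well-known remedies. As a minimal sketch - entirely hypothetical, not the actual Hawaii software - here is how a test alert and a live alert could be kept on separate code paths, with the live path gated behind an explicitly re-typed confirmation phrase instead of sitting one menu row away from "test":

```python
from dataclasses import dataclass

# Hypothetical sketch, NOT the real Hawaii EMA system: test and live alerts
# are distinct actions, and the live action demands a typed confirmation
# phrase so a single mis-click cannot trigger a real broadcast.

LIVE_CONFIRMATION_PHRASE = "SEND LIVE ALERT"

@dataclass
class AlertResult:
    sent: bool      # did anything actually go out?
    live: bool      # was this the real-alert path?
    reason: str     # human-readable outcome

def send_test_alert() -> AlertResult:
    """Test alerts run on their own clearly labeled code path."""
    return AlertResult(sent=True, live=False, reason="test broadcast")

def send_live_alert(typed_confirmation: str) -> AlertResult:
    """A live alert refuses to send unless the operator re-types the phrase."""
    if typed_confirmation != LIVE_CONFIRMATION_PHRASE:
        return AlertResult(sent=False, live=True,
                           reason="confirmation mismatch - nothing sent")
    return AlertResult(sent=True, live=True, reason="live broadcast")
```

The point of the sketch: the dangerous action is structurally different from the routine one, so the "hot spot" - two near-identical menu entries - never exists in the first place.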
... Around 8:05 a.m., the Hawaii emergency employee initiated the internal test, according to a timeline released by the state. From a drop-down menu on a computer program, he saw two options: "Test missile alert" and "Missile alert."
This is a classic example of what can be called a "hot spot for user entry error."
... He was supposed to choose the former; as much of the world now knows, he chose the latter, an initiation of a real-life missile alert.
... "Based on the information we have collected so far, it appears that the government of Hawaii did not have reasonable safeguards or process controls in place to prevent the transmission of a false alert," Pai said in a statement.
... Part of what worsened the situation Saturday was that there was no system in place at the state emergency agency for correcting the error, Rapoza said...."In the past there was no cancellation button. There was no false alarm button at all,"
.... "Part of the problem was it was too easy - for anyone - to make such a big mistake," Rapoza said. "We have to make sure that we're not looking for retribution, but we should be fixing the problems in the system."
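"No cancellation button" is also a design choice, not an inevitability. A hypothetical sketch (my own illustration, not the state's software) of the missing capability: every broadcast gets an ID, and a matching false-alarm retraction can be issued against that ID immediately:

```python
import itertools

# Hypothetical sketch: an alert log that supports the rapid retraction
# Hawaii's system lacked. Each broadcast is assigned an ID; a false-alarm
# correction can then be issued referencing the original alert.

class AlertSystem:
    def __init__(self):
        self._ids = itertools.count(1)
        self.broadcasts = []            # list of (id, kind, text)

    def broadcast(self, text):
        """Send an alert and return its ID for possible later retraction."""
        alert_id = next(self._ids)
        self.broadcasts.append((alert_id, "alert", text))
        return alert_id

    def retract(self, alert_id):
        """Send a false-alarm correction tied to the original alert."""
        if not any(i == alert_id and k == "alert"
                   for i, k, _ in self.broadcasts):
            raise ValueError("no such alert to retract")
        self.broadcasts.append((alert_id, "false_alarm",
                                "FALSE ALARM. There is no missile threat."))
```

In the real incident, the correction took 38 minutes precisely because no such path existed; building it in from day one is elementary fail-safe design.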
It would not be unreasonable to predict that U.S. armed forces were put on alert, perhaps even scrambling fighter planes near the Korean peninsula - moves that other countries could detect.
This "mishap" could have caused N. Korea or another hostile country to react, and led to catastrophe.
The utterly incompetent IT personnel and their utterly incompetent managers who birthed such a cornucopia of IT atrocities should be severely punished.
Jan. 16, 2018. Update, update. Who's got the button?
This new WaPo story shows the "unholy" menu - a jumbled mess.
Was this designed by an expert in human-computer interaction? I think not...