Or was the problem the syndrome of inappropriate confidence in computers (SICC syndrome)?
I opine all of the above, in this cautionary tale:
Woman left to die after 999 ambulance blunder
By Laura Donnelly, Health Correspondent
Published: 9:00AM GMT 21 Mar 2010
An investigation into a woman’s death has exposed a catastrophic decision by ambulance chiefs which may have cost hundreds of lives.
The blunder arose when call centre staff were not warned of flaws with a computer system that prioritises emergencies before dispatching ambulances.
Bonnie Mason, 58, fell down the stairs and died from a head injury after 999 controllers in Suffolk failed to identify her situation as “life-threatening”.
An investigation by The Sunday Telegraph has uncovered a critical flaw embedded in the software used by most ambulance services. For years, 999 calls in life-threatening situations like Mrs Mason’s were accidentally “downgraded”, with call handlers told not to send the most urgent response.
While some services spotted the risk, ordering operatives to override the computer’s orders manually, five of England’s 12 ambulance trusts did not allow call handlers to upgrade such calls. [A belief that "the computer is omniscient" seems to be the only basis for such an exclusion of human judgment - ed.] They include the East of England ambulance service, which covers Suffolk and which only identified the risk after Mrs Mason’s death.
The danger in the system was created by the country’s most senior ambulance officials as they altered the program used by most control centres in an attempt to manage demand for 999 services.
Most ambulance services use an international computerised system designed in America. In the US version, a fall of more than 6ft receives the maximum priority response. However, the government committee which governs its use in this country decided that such cases should be deemed less urgent [what were they thinking? - ed.], and excluded from an eight-minute category A target response time.
In doing so, they created a potentially lethal flaw in the system. It meant that if a call involved a fall of more than 6ft it was designated a lower priority – a category B response – despite the presence of life-threatening conditions which were supposed to receive the most urgent category A response.
[NOTE: If the other life threatening conditions were ignored by the computer system after its "first-pass" look at the height of the fall, then who actually created the most severe flaw is unclear to me, those who altered the parameter or those who designed the overall decision support logic - ed.]
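The masking failure described in the article can be sketched in a few lines of Python. To be clear, everything here is an illustrative assumption on my part — the rule names, thresholds, and condition lists are invented for the sketch, not taken from the actual AMPDS logic. The point is structural: if a first-pass rule on fall height returns a category before life-threatening findings are ever checked, those findings are silently suppressed; ordering the logic by clinical urgency (or taking the most urgent finding overall) avoids the trap.

```python
# Hypothetical, greatly simplified sketch of the triage flaw described above.
# Rule names, thresholds, and conditions are illustrative assumptions,
# NOT the real AMPDS/dispatch logic.

LIFE_THREATENING = {"unconscious", "not breathing", "severe head injury"}

def flawed_priority(fall_height_ft, conditions):
    """First-pass rule on fall height masks later life-threatening findings."""
    if fall_height_ft is not None and fall_height_ft > 6:
        # Downgraded outright; the category A checks below are never reached.
        return "B"
    if LIFE_THREATENING & set(conditions):
        return "A"
    return "B"

def corrected_priority(fall_height_ft, conditions):
    """Check life-threatening findings first, so they always win."""
    if LIFE_THREATENING & set(conditions):
        return "A"
    if fall_height_ft is not None and fall_height_ft > 6:
        return "B"
    return "B"

# A fall down the stairs leaving the patient unconscious:
flawed_priority(8, {"unconscious", "severe head injury"})     # "B" - downgraded
corrected_priority(8, {"unconscious", "severe head injury"})  # "A" - urgent
```

In the sketch, both functions see the same inputs; only the order of the checks differs. That is why, in the NOTE above, responsibility is genuinely ambiguous between those who altered the fall-height parameter and those who designed decision logic in which one early rule could pre-empt every later one.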
As a result, Mrs Mason lay unconscious for more than 38 minutes. The first ambulance sent to her home in the village of Eye, Suffolk, was diverted to attend to a drunk woman who had fallen on a pavement 22 miles away in Thetford, Norfolk. Because the inebriated woman had fallen at ground level, her situation was prioritised over that of Mrs Mason [perhaps because there was no entry of a "height of fall" - ed.], who was close to death by the time paramedics arrived. The East of England ambulance service, which also covers Bedfordshire, Cambridgeshire, Essex, Hertfordshire and Norfolk, said its operatives were instructed never to “override” the advice of the automated system.
Ambulance dispatchers instructed to "never override the advice of the automated system"? Simply stunning, if true.
Read the whole story at the link above.
On another note: government committees have rarely worked well in domains where critical thinking is essential. (I can't wait for the comparative effectiveness committees using flawed data from flawed EMRs to start their work. I'd written about that issue here.)
Finally, "Our policy is to always trust the computer" is not a way to run life-critical healthcare services. Ever.