Sunday, March 21, 2010

Who was at fault here? UK Woman left to die after computer decision support blunder

Who was at fault here? Those who modified the fall-height parameters, those who designed the decision support system such that it could override life-threatening problems based on a single parameter, end users, their managers, or all of them?

Or was the problem the syndrome of inappropriate confidence in computers (SICC syndrome)?

I opine all of the above, in this cautionary tale:

Woman left to die after 999 ambulance blunder
By Laura Donnelly, Health Correspondent
Published: 9:00AM GMT 21 Mar 2010

An investigation into a woman’s death has exposed a catastrophic decision by ambulance chiefs which may have cost hundreds of lives.

The blunder arose when call centre staff were not warned of flaws with a computer system that prioritises emergencies before dispatching ambulances.

Bonnie Mason, 58, fell down the stairs and died from a head injury after 999 controllers in Suffolk failed to identify her situation as “life-threatening”.

An investigation by The Sunday Telegraph has uncovered a critical danger placed in the software used by most ambulance services. For years, 999 calls in life-threatening situations like Mrs Mason’s were accidentally “downgraded”, with call handlers told not to send the most urgent response.

While some services spotted the risk, ordering operatives to override the computer’s orders manually, five of England’s 12 ambulance trusts did not allow call handlers to upgrade such calls [a belief that "the computer is omniscient" seems to be the only basis for such an exclusion of human judgment - ed.]. They include the East of England ambulance service, which covers Suffolk and which only identified the risk after Mrs Mason’s death.

The danger in the system was created by the country’s most senior ambulance officials as they altered the program used by most control centres in an attempt to manage demand for 999 services.

Most ambulance services use an international computerised system designed in America. In the US version, a fall of more than 6ft receives the maximum priority response. However, the government committee which governs its use in this country decided that such cases should be deemed less urgent [what were they thinking? - ed.], and excluded from an eight-minute category A target response time.

In doing so, they created a potentially lethal flaw in the system. It meant that if a call involved a fall of more than 6ft it was designated a lower priority – a category B response – despite the presence of life-threatening conditions which were supposed to receive the most urgent category A response.

[NOTE: If the other life-threatening conditions were ignored by the computer system after its "first-pass" look at the height of the fall, then who actually created the most severe flaw is unclear to me: those who altered the parameter, or those who designed the overall decision support logic? - ed.]
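To make that editorial point concrete, here is a minimal, hypothetical sketch. It is emphatically not the real dispatch software, whose internals are not public; the field names and the structure of the rules are illustrative assumptions. It shows how a single-parameter "first-pass" rule can mask every other life-threatening indicator, and how reordering the checks removes that failure mode:

```python
# Hypothetical sketch only -- not the actual dispatch software.
# Field names and rule structure are illustrative assumptions.

CATEGORY_A = "A"  # most urgent: eight-minute target response
CATEGORY_B = "B"  # less urgent

def flawed_triage(call):
    """Single-parameter 'first-pass': the fall-height rule is evaluated
    first and returns immediately, so later life-threat checks never run."""
    if call.get("fall_height_ft", 0) > 6:
        return CATEGORY_B  # the committee's downgrade wins outright
    if call.get("unconscious") or call.get("head_injury"):
        return CATEGORY_A
    return CATEGORY_B

def safer_triage(call):
    """Evaluate every life-threatening indicator before the fall-height
    rule, so one downgraded parameter can never mask a genuine threat."""
    if call.get("unconscious") or call.get("head_injury"):
        return CATEGORY_A
    if call.get("fall_height_ft", 0) > 6:
        return CATEGORY_B
    return CATEGORY_B

call = {"fall_height_ft": 10, "unconscious": True, "head_injury": True}
print(flawed_triage(call))  # B -- the life threat is masked
print(safer_triage(call))   # A
```

The bug here is not the 6ft threshold itself but the short-circuit: once one rule fires, the rest of the evidence is never consulted.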

As a result, Mrs Mason lay unconscious for more than 38 minutes. The first ambulance sent to her home in the village of Eye, Suffolk, was diverted to attend to a drunk woman who had fallen on a pavement 22 miles away in Thetford, Norfolk. Because the inebriated woman had fallen at ground level, her situation was prioritised over that of Mrs Mason [perhaps because there was no entry of a "height of fall" - ed.], who was close to death by the time paramedics arrived. The East of England ambulance service, which also covers Bedfordshire, Cambridgeshire, Essex, Hertfordshire and Norfolk, said its operatives were instructed never to “override” the advice of the automated system.

Ambulance dispatchers instructed to "never override the advice of the automated system?" Simply stunning if true.

Read the whole story at the link above.

On another note: government committees have rarely worked well in domains where critical thinking is essential. (I can't wait for the comparative effectiveness committees using flawed data from flawed EMRs to start their work. I'd written about that issue here.)

Finally, "Our policy is to always trust the computer" is not a way to run life-critical healthcare services. Ever.

-- SS


Anonymous said...

The computer is king. There is no such thing as a learned intermediary in the scenario described.

Death to the unfortunate who were not considered by the programmers.

This is a sentinel example of the cognitive blunting, professional depredation, apathy, and powerlessness generated by the computerization of decision making.

nama tampilan said...

One of the impacts of technology: health care may need greater precision because it deals with human life, whoever wrote the code on the computer. Is this evidence that the computer is no longer loyal to humans? Heheh, nice post.

Anonymous said...

Also from the 17 March 2010 BBC:

Hospitals 'should axe thousands more beds'

By Nick Triggle
Health reporter, BBC News

"The hospital bed count has been falling for decades

Thousands of hospital beds in England should be axed to save money and improve care, a think tank says.

Centre right group Reform said in some areas up to a quarter of beds could go.

It said advances in technology and rising rates of conditions like diabetes meant the focus should shift towards more community services."

It seems there is a complete and total belief in technology, and this technology can reduce the need for doctors and nurses to care for patients. I am in good health, and it would take a fall of much less than 6 ft. for me to be concerned something broke.

I have to wonder if this belief prevails with those who voted for the health care reform package in the US.

Steve Lucas

MedInformaticsMD said...

Steve, as I have written before:

computer + idiots ≠ Einstein.

Rather, as we see from this example:

computer + idiots = disasters waiting to happen.

Michelle W said...

What's sad is that this was a known problem with a solution that was not implemented, forcing users to "ad hoc" make it work. Such an inability to improve technology is doomed to failure: what if antivirus software worked that way?

Many people panic when a computer flashes a warning message, especially since they're often full of unfriendly language (like "Total System Error" or a string of numbers which in most people's minds equals death). As someone with a great deal of technology experience, I have a more balanced, less heart-wrenching reaction to errors, and use my own judgment about what I can and can't do on the computer.

I think the next generation of tech users will also develop more pragmatism toward the "computer is god" paranoia that many have today, since they'll be less afraid to try things that don't fit into the established software/hardware mold. To them, the computer is a tool that helps them in life, a chief tool, but as ordinary and part of their lives as the microwave or washing machine. After all, these are the kids who quickly learn how to get around blocking programs to access inappropriate content: I don't think a big warning message will deter them if they're bent on a goal.

MedInformaticsMD said...

Michelle W writes:

I think the next generation of tech users will also develop more pragmatism to the "computer is god" paranoia that many have today, since they'll be less afraid to try things that don't fit into the established software/hardware mold.

I hope you are correct.

-- SS