Wednesday, March 04, 2009

IT Vulnerabilities Highlighted by Errors, Malfunctions at Veterans' Medical Centers

Bill Gates's company, Microsoft, touts the user experience as the sine qua non of computing. Billions of dollars have been spent tweaking every little nuance of Windows, with version 7 soon to appear. Apple has done likewise with Mac OS X. (The various X window managers for Linux, less so.) I respect these efforts and use both mainstream OSes in my daily work.

In HIT, however, the "user experience" I outlined in my eight-part series starting here is deemed an issue to solve only after the sale is made, while physicians scramble to avoid harming or killing patients. After all, the HIT industry is unregulated, shielded from liability by the "learned intermediary" doctrine (i.e., clinicians are the bank and insurance company for IT vendors, in that they are the creative implementers of workarounds to IT mismanagement and misdesign), and further shielded by contractual gag clauses against public disclosure of product defects.

This arrangement makes the pharmaceutical industry look like kids in a sandbox.

In the article "IT Vulnerabilities Highlighted by Errors, Malfunctions at Veterans' Medical Centers", Journal of the American Medical Association 2009;301(9):919-920, author Bridget M. Kuehn illustrates the risk of improperly implemented or managed HIT, even in the VA.

The VA is probably the finest environment in the world for HIT: its development of clinical IT over the past few decades has been driven largely by experts from within, dedicated to patients and not profits. Yet its HIT is subject to the same issues as HIT in the for-profit sector.

Kuehn writes in JAMA:

Medical errors and software malfunctions that were linked to changes in the electronic medical records system used at Veterans Affairs (VA) medical centers across the country are drawing attention to the potential vulnerabilities of such systems.

Although many advocates of electronic medical records systems promote them as a means to reduce mistakes affecting patient care, the recent problems in the VA system and other evidence suggest that malfunctions of these systems or problems with the way they are implemented have the potential to lead to medical errors.

Drawing attention after how many years, I ask, of physicians observing such problems in the commercial HIT sector and being muzzled by HIT companies on these issues?

How many times will we need to see such articles in print before physicians and patient advocates wake up? How many times would we need to see in print "chiropractors performing neurosurgery is harming patients" before we put a stop to that practice?

How many medical centers have lists and databases of known HIT defects, waiting to be fixed, that their patients and the public do not know of? How many physicians have been threatened with litigation for airing the dirty laundry?

Understanding how such problems occur as well as how they might be prevented is particularly critical as the Obama administration considers health care reforms that include more widespread adoption of electronic records systems in health care.

Indeed, although I would add "... as the Obama administration also moves from a suggested timeline of 2014 to a coercive 'do this or else' timeline."

As I've written, such a change is cavalier and suffers from the misinformation fed to the administration both by the HIT ecosystem's industrialists and opportunists, and by those suffering from the utopian Syndrome of Inappropriate Overconfidence in Computing.

After a software update of the electronic medical records system at VA hospitals in August, health care workers at these facilities began to report that as they moved from the records of one patient to those of a second patient, they would sometimes see the first patient's information displayed under the second patient's name.

If not for the diligence of the users, that type of error could lead to dead patients.

This records-scrambling problem was reported at 41 of the 153 VA medical centers, said Gail Graham, deputy chief officer of Health Information Management at Veterans Health Administration Headquarters in Washington, DC. Graham explained that the jumbling of records was an uncommon occurrence that only occurred after a particular sequence of events.
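To make "a particular sequence of events" concrete, here is a minimal, purely hypothetical sketch (in Python; this is my illustration, not the VA's actual code) of one classic way record scrambling happens: a slow background fetch for the first patient completes after the user has switched charts, and the viewer displays whatever arrives last without checking whom it belongs to. The guard shown is the standard fix.

```python
# Hypothetical illustration only -- not VistA/CPRS code.  A slow fetch
# for PATIENT-A can finish after the user has opened PATIENT-B's chart;
# without the guard below, A's data would render under B's name.

import threading
import time
from typing import Optional


def fetch_record(patient_id: str, delay: float) -> dict:
    """Simulate a back-end fetch; 'delay' stands in for server load."""
    time.sleep(delay)
    return {"patient_id": patient_id, "meds": f"medication list for {patient_id}"}


class RecordViewer:
    def __init__(self) -> None:
        self.current_patient: Optional[str] = None
        self.displayed: Optional[dict] = None
        self._lock = threading.Lock()

    def open_chart(self, patient_id: str, delay: float) -> None:
        self.current_patient = patient_id
        threading.Thread(target=self._load, args=(patient_id, delay)).start()

    def _load(self, patient_id: str, delay: float) -> None:
        result = fetch_record(patient_id, delay)
        with self._lock:
            # THE FIX: discard any result that no longer matches the
            # chart on screen.  Remove this guard and PATIENT-A's slow
            # fetch silently overwrites PATIENT-B's display.
            if result["patient_id"] != self.current_patient:
                return
            self.displayed = result


viewer = RecordViewer()
viewer.open_chart("PATIENT-A", delay=0.5)  # slow fetch, arrives last
viewer.open_chart("PATIENT-B", delay=0.1)  # fast fetch, arrives first
time.sleep(1.0)
print(viewer.displayed)  # with the guard: PATIENT-B's data, as expected
```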

As I mentioned above, the VA is an example of the finest environment, probably in the world, for HIT, being dedicated to patients and not profits (see the book "Medical Informatics 20/20" by Goldstein et al., and my linked quote, on that culture).

Imagine what goes on in commercial products, where vendors' first priority is profit margin.

Health care workers at the VA medical centers were notified about this potential problem in October, and on December 20, the centers received a software "patch" to fix the problem.

Nine VA medical centers reported another type of problem related to their electronic records system: physician orders to stop medication were missed, causing some patients to receive intravenous medications longer than necessary. The problem occurred because after the software upgrade, physician orders to discontinue such medications, which had previously appeared at the top of the screen, were not displayed.

In 3 cases, patients received infusions of drugs such as heparin for up to 11 hours after their physician had ordered the drug to be discontinued. Graham said the affected patients were not notified because they had not been harmed by the oversights. This software problem was corrected on December 8.

Again, let this type of error occur once too often, and your patient is dead.
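To illustrate how easily an "upgrade" can hide a stop order, here is a small, purely hypothetical sketch (mine, not the VA's code). Suppose the old screen sorted discontinuation orders to the top, and the new release switched to a plain chronological sort that pushes them below the visible fold; the row count, drug names, and sort rules are invented for the example.

```python
# Hypothetical illustration of the display regression described above.

from dataclasses import dataclass

SCREEN_ROWS = 3  # orders visible without scrolling (assumed)


@dataclass
class Order:
    timestamp: int
    action: str  # "START" or "DISCONTINUE"
    drug: str


orders = [
    Order(1, "START", "heparin infusion"),
    Order(2, "START", "normal saline"),
    Order(3, "START", "insulin drip"),
    Order(4, "DISCONTINUE", "heparin infusion"),  # the critical order
]


def visible_after_upgrade(orders):
    # Post-"upgrade": plain chronological sort.  The heparin stop order
    # (t=4) falls below the three visible rows and is never seen.
    return sorted(orders, key=lambda o: o.timestamp)[:SCREEN_ROWS]


def visible_with_safety_sort(orders):
    # Defensive design: pin life-critical actions above the fold,
    # regardless of timestamp.
    rank = {"DISCONTINUE": 0, "START": 1}
    return sorted(orders, key=lambda o: (rank[o.action], o.timestamp))[:SCREEN_ROWS]


print([o.action for o in visible_after_upgrade(orders)])
# ['START', 'START', 'START']  -- the stop order is invisible
print([o.action for o in visible_with_safety_sort(orders)])
# ['DISCONTINUE', 'START', 'START']
```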

There are hundreds of types of problems that can occur as a result of such technology, and those experienced by the VA medical centers are not uncommon, according to Ross Koppel, PhD, professor of sociology at the School of Medicine at the University of Pennsylvania in Philadelphia.

For example, the scrambling of two different electronic records is a common problem that is not limited to systems used in health care settings, he said. Additionally, research indicates that poor layout of information in electronic medical records and related health information technology systems is the most common type of flaw (Koppel R et al. JAMA. 2005;293[10]:1197-1203). He explained that when health care workers have to look at several different screens or scroll through pages to access the information needed to make medical decisions, mistakes can occur.

My eight-part series came to the same conclusions. In fact, I reached those conclusions and started writing publicly about them in 1998. Was anyone listening? How many patients have been harmed, or have died, in the past ten years due to industry inattention to these issues and the leadership of HIT projects by unqualified personnel?

Additionally, medical errors also may arise from poor communication between electronic medical records and related technologies (such as computerized physician order entry systems) or between software applications created by different companies.

Other types of errors are caused not by flawed software but by how workers use that software in real medical settings. For example, an electronic medical record system may require a physician to enter a weight for a patient before prescribing a drug, even if a precise weight is not needed for that particular drug. A physician in a rush may enter an estimate of the patient's weight, but another physician who subsequently views the record might use that estimated weight to prescribe a medication that does require an accurate weight, potentially causing a dosage error. Detecting such problems requires careful observation of how systems are being used in real clinical settings and interviewing clinical staff about their experience, noted Koppel.

I claim these types of errors are due to flawed software. The flaw is in the ill-conceived, management-information-systems (MIS) inventory-system-mentality, mission-hostile user experience the software presents to clinicians.
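Koppel's weight example shows why: the fix is not "train users harder" but design that carries data provenance. A minimal sketch, assuming a hypothetical provenance flag and illustrative drug names:

```python
# Hypothetical sketch -- the provenance flag, drug list, and dosing
# rule below are invented for illustration, not clinical guidance.

from dataclasses import dataclass


@dataclass
class Weight:
    kg: float
    estimated: bool  # set by the rushed first physician


WEIGHT_CRITICAL_DRUGS = {"enoxaparin", "gentamicin"}  # assumed examples


def dose_per_kg(drug: str, mg_per_kg: float, weight: Weight) -> float:
    """Compute a weight-based dose, refusing to dose weight-critical
    drugs against an estimated weight."""
    if drug in WEIGHT_CRITICAL_DRUGS and weight.estimated:
        # Surface the problem instead of silently propagating a guess.
        raise ValueError(
            f"{drug} is weight-critical and the recorded weight is an "
            "estimate; obtain a measured weight before dosing."
        )
    return mg_per_kg * weight.kg


w = Weight(kg=80.0, estimated=True)  # entered under time pressure
try:
    dose_per_kg("enoxaparin", 1.0, w)
except ValueError as e:
    print(e)
```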

Although makers of electronic medical records software test it to detect problems that may lead to errors, federal oversight of such testing is lacking. The Certification Commission for Healthcare Information Technology, which is composed of health information technology industry organizations, has a certification program, but critics argue that such oversight is not sufficient and that the US Food and Drug Administration, the Centers for Medicare & Medicaid Services, or a new governmental agency should be given the authority to oversee these systems (Hoffman S and Podgurski A. Harv J Law Tech. 2008;22[1]:104-165).

"The bar on certification has to be raised to a level that ensures real safety, not just minimal compliance," Koppel said.

The CCHIT process is merely a specifications check or qualification of features, and by their own admission is nearly useless regarding most of the issues discussed in these articles. Worse, CCHIT has significant conflicts of interest with the HIT industry, having senior people who also hold senior industry positions and thus have fiduciary responsibilities to their employers (whether that is acceptable for a "certifying" organization is open to debate, but the conflicts exist as fact). On a more minor point, this organization can't even manage its books properly, having failed to file its required reports and having been temporarily, involuntarily dissolved.

In fact, let me state that the lack of oversight of this industry amounts to, at best, governmental negligence and, at worst, a complete dereliction of the obligation to protect the public.

Imagine this type of arrangement in pharma or the (physical) medical device industry.

Almost ten years ago I wrote that HIT is a clinical tool for unforgiving medical environments that happens to involve computers, not a management information system that happens to involve doctors.

In other words, HIT systems are virtual medical devices that affect every aspect of care, and they can cause the "learned intermediaries" to commit errors, as the JAMA article illustrates.

Koppel also urged caution as electronic records are rolled out. Although these technologies offer great promise in providing data for research and quality-improvement initiatives, he said more work is needed to make these systems work effectively in the context of health care.

"We should encourage [the technologies' development, but we should not force doctors to use them until they are shown to be more responsive to the needs of physicians and other clinicians," he said.

As to the "they", I believe Ross Koppel was referring to the computer artifacts themselves.

I would extend the "they" to mean the healthcare IT vendor industry itself, to most of whose leaders I'd probably give an "F" regarding these issues and knowledge of Biomedical Informatics.

Excluding NextGen, whose AVP for Government Relations holds the vaunted and much-coveted "American Medical Informatics Certification for Health Information Technology."

Finally, the VA is going to embark on upgrades to its laboratory components not by relying on its internal expertise, but by engaging the services of a private HIT company, Cerner, a company with a troubled record regarding the national IT initiatives in the UK (see here, here).

Considering that company's issues, and the internal complexities, dependencies, intra-component messaging, and other idiosyncrasies of a system as complex as VistA, I opine that this is a major error.

Based upon my decades of IT knowledge, my biomedical informatics training, my analytical abilities, and my own informed judgment, it is my deep fear that the VA will not "luck out" as it did in the JAMA article above; patients will be harmed somewhere down the line.

-- SS


Addendum 3/5/09:

I am promoting a user comment and my response to the body of this post itself, as I think they exemplify a central issue regarding HIT:


Paul Trossel on March 5, 2009 5:05:00 AM EST said...

Computer software in the health industry consists of tools to use. Often the user thinks these tools are the 'law' in what they have to do. As long as people don't understand the way they have to use their tools (e.g., surgical scalpels, machines, software, etc.), mistakes will be made. Always check the outcome of the computer against your own judgment. Unless supervisors, bosses, etc. help their people understand that the mind of a qualified professional comes first and the output of the tools comes second, mistakes that lead to the death of a patient will keep occurring!


My reply:

Mr. Trossel,

Forgive me for saying so, but you sound like an apologist for the Health IT medical device industry, which rightly fears increased scrutiny of its design, development, QC and lifecycle practices.

HIT is a medical device that happens to involve computer automation.

My apologies if I am incorrect, but it appears you've not taken care of a patient in the "fog of war" known as the ED, ICU, or busy hospital floor.

Tools that distract, tools that mislead, tools causing cognitive overload are simply dangerous in such an environment. That is a clinical reality.

It is also clear you've not read my post [as above] in its entirety, nor my multi-part series on the mission-hostile user experience presented by HIT.

It appears even the most highly trained experts, especially when distracted, can be misled by faulty devices.

The recent crash of the Turkish airliner seems a case of just that. See "Altimeter, Crew Cited in Dutch Crash" in the Wall Street Journal.

It is the primary responsibility of the manufacturer of a medical device such as HIT to assure it is the best device possible, accounting for the very well known realities of the clinical workplace.

It is not the responsibility of the end user to be the bank and insurance policy for cavalier software design, through tiring improvisation and workarounds to compensate for software device mismanagement and design decisions made by incompetents and HIT amateurs.

There is no excuse for ill-conceived, ill-designed, and ill-implemented HIT devices. None whatsoever, and most especially no excuse for depending on busy human users to compensate for the flaws on a 100%-reliability basis.

Resilience engineering, a term I learned when presenting these HIT issues to the IEEE Medical Technology Policy Committee in December 2007 (PPT here), is not about depending on busy people to cover for your device's defects.

Regarding a meta-issue:

It's absolutely a sign of our culture being in great distress when physicians have to answer to IT personnel about why physicians should not have to be the "workarounds" and safety valves for ill-conceived IT.


-- SS

8 comments:

Anonymous said...

One of the causes of the "mission-hostile" user interface problems you mention is the fact that common desktop operating systems--particularly Windows--were designed primarily for generic office automation. Development environments for these systems, such as Visual Basic in the past and currently Visual Studio, offer a set of interface widgets designed for this purpose. In contrast, systems intended to support high performance in human operators engaged in specialized tasks, such as piloting commercial or military aircraft, have special-purpose interfaces designed specifically for the task at hand. In scope, medicine is even broader and more complex, and it certainly deserves specific, high-performance interface design. That kind of work is demanding and expensive, however. It has been much less expensive to hire a cheap Visual Basic jockey and slap together a forms-oriented, tabbed interface by dragging standard widgets from palettes onto a window representation. That's the heritage of most of our current EHR implementations.

InformaticsMD said...

Development environments for these systems, such as Visual Basic in the past and currently Visual Studio, offer a set of interface widgets designed for this purpose. In contrast, systems intended to support high performance in human operators engaged in specialized tasks, such as piloting commercial or military aircraft, have special-purpose interfaces designed specifically for the task at hand.

I see this as a secondary issue.

When one does not understand the domain, one can have the best tools and processes and still produce bad products.

As I've written, a thousand generic workers following the finest of process will always be outperformed by one domain expert who actually knows what they're doing.

I can add to that as follows:

"a thousand generic workers with the finest of development tools, following the finest of process, will always be outperformed by one domain expert with primitive tools but who actually knows what they're doing."

-- SS

Anonymous said...

Computer software in the health industry consists of tools to use. Often the user thinks these tools are the 'law' in what they have to do. As long as people don't understand the way they have to use their tools (e.g., surgical scalpels, machines, software, etc.), mistakes will be made. Always check the outcome of the computer against your own judgment. Unless supervisors, bosses, etc. help their people understand that the mind of a qualified professional comes first and the output of the tools comes second, mistakes that lead to the death of a patient will keep occurring!

InformaticsMD said...

Mr. Trossel,

Forgive me for saying so, but you sound like an apologist for the Health IT medical device industry, which rightly fears increased scrutiny of its design, development, QC and lifecycle practices.

HIT is a medical device that happens to involve computer automation.

My apologies if I am incorrect, but it appears you've not taken care of a patient in the "fog of war" known as the ED, ICU, or busy hospital floor.

Tools that distract, tools that mislead, tools causing cognitive overload are simply dangerous in such an environment. That is a clinical reality.

It is also clear you've not read my post in its entirety, nor my multi-part series on the mission-hostile user experience presented by HIT.

It appears even the most highly trained experts, especially when distracted, can be misled by faulty devices.

The recent crash of the Turkish airliner seems a case of just that. See "Altimeter, Crew Cited in Dutch Crash" in the Wall Street Journal.

It is the primary responsibility of the manufacturer of a medical device such as HIT to assure it is the best device possible, accounting for the very well known realities of the clinical workplace.

It is not the responsibility of the end user to be the bank and insurance policy for cavalier software design, through tiring improvisation and workarounds to compensate for software device mismanagement and design decisions made by incompetents and HIT amateurs.

Anonymous said...

Technology should be a help in one's work and not a hindrance, especially in complicated fields like medicine. I believe that the technology used in the medical sector should not only pass through the most stringent quality checks but also be safeguarded from as much human error as possible…

Anonymous said...

All true! I'm with you when you state: "the manufacturer of a medical device such as HIT [must] assure it is the best device possible". However, it doesn't take the responsibility away from the person who uses this tool to check and double-check. Of course, in the heat of the moment on a busy hospital floor, that might be forgotten, or there is just no time. I merely pointed out that it is dangerous to trust only the tool and not take the professional responsibility that comes with using such devices, especially knowing that there are malfunctions in practically all computerized tools. That is the shame of the manufacturer. Nevertheless, better sure than sorry.
And just for the sake of the argument, I'm not a beneficiary of the IT business. I consider IT dangerous in the hands of people who solely trust the workings of these tools, more especially the professionally trained; they should know better!
No, it's not the responsibility of the end user that software device makers didn't take their responsibility to deliver a one-hundred-percent working and trustworthy device. But it is the responsibility of the buyers of such devices to check and double-check whether what they are buying does what it is supposed to do. And even then, when using it, be and stay alert. Your example of the crashed Turkish airliner proves that too.

InformaticsMD said...

However, it doesn't take the responsibility away from the person who uses this tool to check and double-check.

While I agree in concept, in the clinical setting it is impractical.

For example, a falsely low reported INR (not at all an unusual occurrence) will lead to an increase in the anticoagulant dose. The INR is not something a doctor can see by examining the patient, but if reported falsely it can lead to increased anticoagulants, a bleed, and death.

Finally, if a user has to check and double check to make sure the system is not providing faulty data (which may mean picking up the phone, checking paper, etc.), then why bother having said system, especially if said system costs tens of millions of dollars?

The answer is solid design and resilience engineering - up front.
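For the INR scenario above, one concrete example of such up-front design is a "delta check," the routine clinical-laboratory practice of flagging a result that departs implausibly from the patient's prior value before anyone doses against it. A minimal sketch, with illustrative thresholds only:

```python
# Minimal sketch of a delta check; the threshold is illustrative,
# not clinical guidance.

def delta_check(prior: float, current: float, max_ratio: float = 2.0) -> bool:
    """Return True if the new result is implausible versus the prior one
    and should be held for verification before clinical action."""
    if prior <= 0 or current <= 0:
        return True  # non-physiologic value; always flag
    ratio = max(prior, current) / min(prior, current)
    return ratio > max_ratio


# A therapeutic INR of 2.5 suddenly reported as 1.0 is held for
# verification instead of triggering a reflexive dose increase.
print(delta_check(prior=2.5, current=1.0))  # True  -> hold and verify
print(delta_check(prior=2.5, current=2.2))  # False -> plausible
```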

-- SS

InformaticsMD said...

But it is the responsibility of the buyers of such devices to check and double-check whether what they are buying does what it is supposed to do.

The average hospital cannot do that, considering the level of resources and testing it would take.

Perhaps hospitals should also do that with the drugs they administer.

Oh wait - there's a federal agency with thousands of people who do that. (Not entirely perfectly, but they do it.)

We need the same regulatory oversight for HIT.

-- SS