
Wednesday, December 22, 2010

Unintended errors with EHR-based result management: a case series, and a special pleading for health IT

As I wrote at "Report of an AMIA special task force on challenges in ethics, safety, best practices, and oversight regarding HIT", articles on the real and potential downsides of health IT appear to be becoming a trend in the premier journal of Medical Informatics, the Journal of the American Medical Informatics Association (JAMIA).

Another article just appeared in JAMIA as the result of a study of healthcare IT-related errors: "Unintended errors with EHR-based result management: a case series"; Thomas R Yackel and Peter J Embi; JAMIA 2010;17:104-107; doi:10.1197/jamia.M3294.

The article presents a series of health IT-related errors and categorizes them systematically, and thus adds to our knowledge on the issue of cybernetic clinical test results management. It also makes recommendations for increased vigilance and remediation.

The abstract is below (access to the article itself requires a JAMIA subscription).

ABSTRACT

Test result management is an integral aspect of quality clinical care and a crucial part of the ambulatory medicine workflow. Correct and timely communication of results to a provider is the necessary first step in ambulatory result management and has been identified as a weakness in many paper-based systems. While electronic health records (EHRs) hold promise for improving the reliability of result management, the complexities involved make this a challenging task. Experience with test result management is reported, four new categories of result management errors identified are outlined, and solutions developed during a 2-year deployment of a commercial EHR are described. Recommendations for improving test result management with EHRs are then given.

The article begins:

Over a 2-year period from 2005 to 2007, coinciding with the first 2 years of a planned 3-year deployment of the ambulatory EHR to multiple practice sites, the vast majority of laboratory result routing events functioned as intended. However, seven error types were identified as causing a substantial delay or disruption in result delivery to providers’ electronic inboxes [no statement is made about patient harm or "close calls" that may have resulted - ed.] and led to further investigations and case finding by our group.

Upon analysis, these seven error types were logically grouped into four distinct error categories: (1) interface and results routing logic errors, (2) provider record issues, (3) EHR system settings, and (4) system maintenance.

This was at OHSU, a leading institution in medical informatics, not at some organization that's a newcomer to health IT.
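
To make the first two error categories concrete, here is a minimal, hypothetical sketch in Python (my own illustration, not code from the paper or from any vendor's system) of how a routing rule combined with a stale provider record can silently strand a result:

```python
# Hypothetical illustration only -- not from the JAMIA paper or any vendor's system.
# It shows how routing logic (category 1) combined with a stale provider record
# (category 2) can leave a result undelivered with no error reported to anyone.

def route_result(result, provider_directory, inboxes):
    """Route a lab result to the ordering provider's electronic inbox."""
    provider = provider_directory.get(result["ordering_provider_id"])

    # If the provider record is missing or marked inactive, this routing rule
    # simply gives up: nothing is logged, no one is alerted, and the result
    # never reaches a clinician.
    if provider is None or not provider.get("active", False):
        return False

    inboxes.setdefault(provider["inbox_id"], []).append(result)
    return True


if __name__ == "__main__":
    # The ordering provider left the practice; the directory was never updated.
    directory = {"dr_a": {"active": False, "inbox_id": "dr_a_inbox"}}
    inboxes = {}
    delivered = route_result(
        {"ordering_provider_id": "dr_a", "test": "potassium", "value": 6.5},
        directory, inboxes)
    print("delivered:", delivered)  # False -- a critical result went nowhere
```

The point of the sketch is simply that the failure mode is silent: the routing code returns, the interface engine moves on, and no human being is told that a result fell through the cracks.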

Each of the "error categories" is described in some detail. The article then makes recommendations for improved systems, which sound simple but are going to be far more resource-intensive on a national scale than meets the eye:

1. Develop fault-tolerant systems that automatically report delivery failures. [A sketch of what this could look like appears after this list - ed.]
2. Use robust testing to find rare errors that occur both within and between systems.
3. Implement tracking mechanisms for critical tests, such as cancer screening and diagnostics.
4. Deliver results directly to patients.
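
As promised above, here is a minimal sketch, again hypothetical and of my own devising rather than anything prescribed by the paper, of what "automatically report delivery failures" could look like in practice: every delivered result carries an acknowledgment deadline, and anything unacknowledged past that deadline is surfaced for escalation instead of disappearing.

```python
# Hypothetical sketch of recommendation 1 -- the paper does not prescribe an
# implementation. Every result delivery is tracked, and unacknowledged results
# past their deadline are reported for escalation rather than silently lost.

from datetime import datetime, timedelta

class ResultDeliveryTracker:
    def __init__(self, ack_window_hours=24):
        self.ack_window = timedelta(hours=ack_window_hours)
        self.pending = {}  # result_id -> (description, acknowledgment deadline)

    def record_delivery(self, result_id, description):
        """Log every delivery attempt together with its acknowledgment deadline."""
        self.pending[result_id] = (description, datetime.now() + self.ack_window)

    def acknowledge(self, result_id):
        """Called when the provider actually opens or acts on the result."""
        self.pending.pop(result_id, None)

    def overdue(self, now=None):
        """Return results nobody has acknowledged by the deadline, for escalation."""
        now = now or datetime.now()
        return [(rid, desc) for rid, (desc, deadline) in self.pending.items()
                if now > deadline]


if __name__ == "__main__":
    tracker = ResultDeliveryTracker(ack_window_hours=24)
    tracker.record_delivery("r1", "potassium, Dr. A")
    tracker.record_delivery("r2", "chest CT, Dr. B")
    tracker.acknowledge("r1")

    # A day and a half later, the unacknowledged CT result is flagged.
    later = datetime.now() + timedelta(hours=36)
    for rid, desc in tracker.overdue(now=later):
        print(f"ESCALATE: result {rid} ({desc}) was never acknowledged")
```

Even this toy version hints at the resource question: someone has to own that escalation queue, around the clock, at every site.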

I find myself uncomfortable with the possible human resource costs of implementing the recommendations, especially on a national scale. These costs would be over and above the hundreds of millions per institution and the hundreds of thousands per private doctor already spent, or planned to be spent.

My other issue regarding the article (my main issue, actually) is its editorializing for a product, health IT, in a scientific article, and making a special pleading for the technology.

The next-to-last paragraph of the article reads more like an editorial, perhaps meant to make vendors comfortable, than a scientific statement of fact supported by the article's findings:

Finally, while it might be tempting to attribute the errors noted above to the use of a particular health information system or even Health IT in general, an examination of the cases reveals that most of these errors actually resulted from local configuration and implementation decisions rather than to the technologies themselves. Indeed, the authors believe that these cases further support the emerging truism [wow! This is news to me - ed.] that errors related to Health IT are in most cases the result of human error in the implementation of new information and communication systems into our existing complex healthcare environments.[10] Therefore, we contend that the main lesson arising from these cases is that care must be taken by those responsible for implementing health information systems to remain aware of the kinds of errors that might occur and monitor for the unexpected consequences that will undoubtedly take place, but not to avoid use of such systems that likely have the capacity for far greater benefit than harm, if implemented and monitored properly.

In this paragraph the authors state: "... while it might be tempting to attribute the errors noted above to the use of a particular health information system or even Health IT in general, an examination of the cases reveals that most of these errors actually resulted from local configuration and implementation decisions rather than the technologies themselves."

Sept. 2011 addendum: the article was written before Dr Jon Patrick's exposé of examples of the internal flaws of commercial health IT, as here: link, link.

As for "rather than the technologies themselves", technologies themselves are never a problem by themselves, even the atomic bomb. In a reductio ad absurdum, which is maybe not so absurd, it took a B29 Superfortress to drop two A-bombs; the bombs could have been deactivated and put in a museum instead.

However, consider a poorly designed A-bomb that could unpredictably go "BOOM" - now that would be a problem.

While I agree that some errors are due to mismanaged implementation, the article makes no differentiation between design issues and implementation issues (i.e., local configuration and implementation decisions). Yet according to industry leaders and non-industry experts alike, fundamental design is crucial in areas that cannot be vastly improved by local configuration decisions:

HIMSS's former Chairman of the Board admits the technology remains experimental:

... We’re still learning, in healthcare, about that user interface. We’re still learning about how to put the applications together in a clinical workflow that’s going to be valuable to the patients and to the people who are providing care. Let’s be patient. Let’s give them a chance to figure out the right way to do this. Let’s give the application providers an opportunity to make this better;

While HIMSS itself admits in this 2009 PDF that

"Electronic medical record (EMR) adoption rates have been slower than expected in the United States, especially in comparison to other industry sectors and other developed countries. A key reason, aside from initial costs and lost productivity during EMR implementation, is lack of efficiency and usability of EMRs currently available";

While the National Research Council (the highest scientific authority in the U.S.) last year reported that:

"Current Approaches to U.S. Health Care Information Technology are Insufficient" and that the technology "does not support clinicians' cognitive needs." The study was chaired by Medical Informatics pioneers Octo Barnett (Harvard/MGH) and William Stead (Vanderbilt);

It is very difficult if not impossible to make a clinical IT silk purse out of a poorly designed sow's ear, no matter how many sound "local configuration and implementation decisions" are made.

Further, the JAMIA article states that human error in implementation as the cause of health IT woes is an "emerging truism".

Making the case that some observation reflects a "truism" is a powerful claim. Such a claim deserves more than one reference, but here's what we have:

"... the authors believe that these cases further support the emerging truism that errors related to Health IT are in most cases the result of human error in the implementation of new information and communication systems into our existing complex healthcare environments" [10].

10. Ash JS, Berg M, Coiera E. Some unintended consequences of information technology in health care: the nature of patient care information system-related errors. J Am Med Inform Assoc 2004;11:104–12.

Perhaps the term "truism", emerging or otherwise, should be avoided in 2010 regarding errors related to health IT.

The authors contend, presumably from the above observations, that:

... we contend that the main lesson arising from these cases is that care must be taken by those responsible for implementing health information systems to remain aware of the kinds of errors that might occur and monitor for the unexpected consequences that will undoubtedly take place

"Might occur?" How about "that do occur" - as in the paper? Above all, these involve patients.

Unexpected consequences - these involve patients, too.

My relative was nearly killed by "unexpected consequences" of health IT in May 2010. Perhaps that makes me less cavalier about health IT.

In fact, the certainty that unexpected consequences will "undoubtedly take place" reaffirms that these are still experimental technologies.

I remind readers that it might be best to focus on fundamental design issues before expensive systems that can cause errors and unexpected consequences are put into place. These are mission-critical systems involving live patients who have not, incidentally, been afforded informed consent to the use of these medical devices in their healthcare.

Another editorial comment follows:

[the lesson is that those responsible should remain aware] but not to avoid use of such systems that likely have the capacity for far greater benefit than harm, if implemented and monitored properly

Once again, this is an editorial and a value judgment. Who knows whether health IT ultimately has the capacity for far greater benefit than harm? If these systems will predictably produce unexpected consequences, how can we know that? Why should critical-thinking practitioners not avoid such systems for now, until we better understand how to design them to improve usability and support clinician cognition?

Why put patients at risk en masse as part of a national experiment when studies even at advanced HIT sites show fundamental problems that could harm or kill?

I argue that this paper, and the others now "emerging" on the downsides and lack of ROI of health IT, make the case for great caution and slowness (i.e., avoidance) in adoption.

Yet the authors seek special accommodation for this technology, something that is perhaps unprecedented with (unregulated) medical devices of unknown risk.


The lesson is actually that we need to slow down with health IT: reboot, and start solving this technology's problems before attempting a national rollout.


This is the ethical position regarding any experimental medical technology that is proving risky at a level not clearly known.

-- SS

8 comments:

  1. These guys point out prima facie evidence of the need for the FDA to vet these devices for safety and efficacy.
    The design flaws of these simple data repositories are legion. They do not put the results in the doctors' hands...someone has to look for them. These guys ignore the dangers of hospital electronic paperless data repositories, where disease-critical data with vital interval changes are not seen for hours because, "who knew?", the results were there.

    Bottom line: These devices facilitate data sliding through the cracks, unless, i.e., UNLESS, someone is scanning all of the data silos of every patient continuously and bringing new results to the attention of the captain of the ship.

    Personally, I think it stinks when I call a nurse and ask for the potassium results and he/she says it is "normal" because they do not have enough time to run the clickorrhea game of logging on and finding the result. When the K+ goes from 3.6 mg% to 4.8 mg%, yes, it is technically "normal", but the interval change is NOT.

    Patients die from these "normal" results, e.g., the next day when the K+ is 6.5 mg%.

    ReplyDelete
  2. They have a brilliant solution: deliver results to the patients. What an incredible advance!!!! Nobel prize, here!

    Wow, and use robust testing!! How ingenious!! Why not do that before the crap is put into hospitals?

    And tracking for critical tests!! Could never have thought of that one, but, why not for all tests? Like, why not automatically print all the results when they arrive, the way a monitor spews out a strip of ventricular tachycardia?

    Oh, and reporting delivery failures??? Why not get it right the first time? In so doing, pull the defective stuff out of use right now?

    These authors know how bad their devices are, but do not want to admit it.

    ReplyDelete
  3. I stumbled onto this forum while googling for the proper software to use a McKesson product. After following the instructions from the hospital IT department, I found instructions that were conflicting and inadequate. I am still unable to use the product as McKesson told the hospital it would work. This software inadequacy has consumed hours of my time and leaves me unable to make an important clinical recommendation until the IT department is available tomorrow. Now I find out that McKesson's former CEO is a crook. The local HIT systems have created an unsafe environment for patient care for those of us in clinical practice. It is a daily routine to have to search for reports after finding out that a test was done or that a patient was hospitalized, as well as looking for reports that are misfiled electronically. Sometimes this involves wading through a lot of meaningless and unnecessary data to find the important information. There is a lot of electronic "fudging" of the records with boilerplates, carry-over information, and robodoc information. Some records are obviously fraudulent. Why hasn't the safety issue been pursued before? If the vendors are requiring indemnification contracts, why aren't they being sued for fraudulently promoting the benefits of their products without informing us of their shortcomings?

    ReplyDelete
  4. A false truism...are these authors for real? They use a reference from 2004 to defend their statements on user-error truisms. It may have been a truism then, or at least the PR dogma of the vendors to shut up the complainers, but it is not now.

    Are they kidding me? As they say, c'mon man.

    ReplyDelete
  5. Anonymous December 22, 2010 10:00:00 PM EST writes:

    Now I find out that McKesson's former CEO is a crook.

    Believe you are referring to financial irregularities perpetrated by the former CEO of HBOC, which was acquired by McKesson.

    Why hasn't the safety issue been pursued before?

    I call it a "syndrome of inappropriate overconfidence in computers." Plus, somehow the IT industry has convinced the government that medicine can be "revolutionized" with cybernetics, despite the fact that most medical errors have nothing to do with documentation or lack thereof. Finally, there's a tremendous amount of money to be made; blood for computers, so to speak.

    If the vendors are requiring indemnification contracts, why aren't they being sued for fraudulently promoting the benefits of their products without informing us of their shortcomings?

    For answers to that line of questions, read the many other stories on this blog about how pharma and other medical device manufacturers ply their trade.

    -- SS

    ReplyDelete
  6. Anonymous December 22, 2010 10:48:00 PM EST writes:

    A false truism...are these authors for real?

    Theirs is a political statement, not a scientific one.

    -- SS

    ReplyDelete
  7. Scot,
    Actually, a former McKesson Chairman of the Board, its General Counsel, and other executives all pleaded guilty in a case involving fraud charges at both McKesson and HBOC. See our post here:
    http://hcrenewal.blogspot.com/2009/11/former-mckesson-ceo-and-board-chairman.html

    ReplyDelete
  8. I stand corrected, although McCall originated from HBOC. He was the former chair of McKesson Corp., and he led HBO & Co. before McKesson bought it.

    As HBOC CEO he was a presenter at the ca. 1997 MS-HUG conference I attended and have written about.

    At the meeting, several HIT CEOs delivered a quasi-religious sermon about how HIT would "revolutionize medicine."

    I asked how many in the room had ever practiced medicine or read a medical text such as The Merck Manual. The responses were nearly all negative.

    I then asked how such people would "revolutionize medicine." The CEOs had no good answer, only nonsense.

    -- SS

    ReplyDelete