Wednesday, February 13, 2013

Guest post by Dr. Jon Patrick, U. of Sydney: On the ECRI Institute's recommendations following a "deep-dive" study of HIT-related events

At my Feb. 9, 2013 post "ECRI Institute PSO Uncovers Health Information Technology-Related Events in Deep Dive Analysis", I wrote about an ECRI Institute study of well-defined client data submitted over a 9-week period (a "deep dive" study) on EHR problems.  Risk-creating events at a level that should be a significant concern to patients and clinicians fell into the following categories: 

  • inadequate data transfer from one HIT system to another 
  • data entry in the wrong patient record
  • incorrect data entry in the patient record
  • failure of the HIT system to function as intended 
  • configuration of the system in a way that can lead to mistakes

Dr. Jon Patrick at U. Sydney, a scientist and medical informatician who sees the issues from a number of unique perspectives (link to bio), has offered the following critique of the key recommendations made by the ECRI Institute as a result of this study of health IT events (see aforementioned post for the text of those recommendations). 

One caveat is that neither he nor I have the full report as of this writing in part due to expense:

The ECRI has produced a report on HIT errors. I was concerned about the manner in which the Key Recommendations minimised, generalised, or failed to concretise important issues, and so I offer these comments.

On ECRI Key Recommendations:

1. Enlist leaders’ commitment and support for the organization’s health IT projects.

JP: This is clearly a comment from one level of management to those levels above it. It doesn't seem to be a comment about the nature of the HIT itself but more about the internal processes for getting it established. There is really not much to say here, because if upper management don't want to do anything about the introduction of HIT, then engaging in the task is probably futile; hence it is difficult to understand why it was included.

2. Involve health IT users in system planning, design, and selection.

JP: Yes, this is crucial to successful solutions.

3. Conduct a review of workflow and processes to determine how they must be modified.

JP: The missing clause at the end of this sentence is "to fit the introduced HIT system". This is putting the cart before the horse. The notion that a piece of software designed by engineers should have priority over the team of clinicians who are experts in their business is highly naive. This statement also contradicts point 2 above.

The privileging of the functioning of the IT system as the expert over that of the clinical team, AND making a recommendation in contradiction to an earlier recommendation, is absurd enough by itself to undermine the credibility of the report.

4. Evaluate the ability of existing IT systems within the organization to reliably exchange data with any health IT system under consideration.

JP: And after the evaluation, what should be done? This is hardly an informative recommendation. It is of the form "Do something; it doesn't matter what you do, just do something." A more compelling statement, one that would have enhanced the credibility of the authors and demonstrated their knowledge of the industry, would have been: "Establish unambiguously and conclusively that any health IT system under consideration has the ability to reliably exchange data with existing IT systems within the organization." Note that the focus needs to be on the incoming system being the object of conformity, not the incumbent systems.

5. Conduct extensive tests before full implementation to ensure that the health IT system operates as expected.

JP: This is clearly a desirable goal, and it applies to point 4 above as well as to all other aspects of the system's functions.

6. Provide user training and ongoing support; educate users about the capabilities and limitations of the system.

JP: This is a motherhood statement that does nothing to address the nub of training issues in the use of HIT. Training is an ongoing issue for two reasons: staff change, so new staff come onto the roster and need to be trained; and these systems are complex, so expert advice and skill need to be available to staff as an ongoing service so that work is not delayed by confusion over how to use the IT. An alternative proposal to reduce training costs and investment would be to design the system so that it matches the current best practice of the clinical team; learning the system is then minimised because it fits seamlessly into their work processes.

Surely the issue of educating users about the capabilities and limitations of the system is independent of the training topic. The users will discover the limitations of the system off their own bat as they try to use it.

7. Closely monitor the system’s ease of use and promptly address problems encountered by users.

JP: This is a quite inadequate description of the action that needs to be taken. The real consideration is for users to have an effective avenue for expressing the system's limitations, in a manner that is acceptable and for which they are NOT chastised as malcontents [or Luddites or technophobes - ed.], AND for the unnecessary or dangerous limitations to be addressed promptly.

8. Introduce alterations to a health IT system in a controlled manner.

JP: It is hard to understand what is meant by this statement. Firstly, I would have said "introduce alterations to the clinical processes in a controlled manner". Secondly, alterations to the HIT system need to be made promptly when it is defective. Once again, as in point 3 above, the technology is being preferenced over the clinical processes, which I argue is back-to-front.

9. Monitor the system’s effectiveness with metrics established by the organization.

JP: This is definitely something that should be done, but it also needs a more directed purpose. It is easy to say "assess patient safety", but most systems don't readily capture relevant data or supply analytical devices to easily draw a picture of patient safety. However, there is an equally important factor in care that is often overlooked: staff productivity and, by implication, morale. There is a need to understand better ways of assessing staff productivity in positive ways, and the effect a CIS has on staff morale. A typical comment about CISs is that they cost staff up to 20% more time to maintain the patient record. Staff wouldn't mind giving up that time if it was returned in some other form, but it isn't, so they feel the time is stolen from them or from their patients.

10. Require reporting of health IT-related events and near misses.

JP: This is poorly stated and is probably meant to read "health IT-related adverse events". This is a positive recommendation but does not go far enough. A more comprehensive statement would be "Install a process and technology for reporting adverse events, and establish methods for evaluating the reports, recommending changes to clinical practice or technology functionality, and implementing them."

11. Conduct thorough event analysis and investigation to identify corrective measures.

JP: This goes some way to dealing with my point in item 10 above. Its weakness is in not identifying the types of problems and not taking a firm stance on completing the corrective action.

SOME of the Items that have been OVERLOOKED

1. The system needs to be alterable at any time by the clinical leadership so as to immediately emplace revised work practices when they are approved.

2. The system needs to have native analytics so that the clinical leadership can monitor the activities of their teams and provide evidenced based feedback to support continuous process improvement.

3. The system needs highly accurate natural language processing so that the search for required content through the clinical record can be fast, efficient and reliable.

Professor Jon Patrick
Chair of Language Technology
School of Information Technologies
Faculty of Engineering
University of Sydney
Health Information Technologies Research Laboratory

This is excellent advice coming from perhaps the only computing expert in the world who has conducted a forensic analysis of a major commercial EHR product, one intended for high-risk EDs. His analysis of that product, intended for government-mandated rollout in public hospitals in Dr. Patrick's Australian state of New South Wales, is at this link.

This work has largely been ignored by the health IT industry and the academic Medical Informatics community.  However, I can assure the industry and the academic community that the report will not be ignored by those in the legal community if (perhaps I should say when) patients are injured or killed by the system's deficiencies.

-- SS


Steve Lucas said...

I remember reading a couple of years ago about a successful EMR installation in a hospital. The first and never broken rule was: Doctors and nurses are in charge.

The computer people would bring a small piece of computerization into the work flow and then staff would have to pass on its approval or modification. This then became part of the building process for a complete system.

The second rule was: Upgrades were to improve the system, not generate fees for the computer people. We see in many instances computer upgrades whose only purpose is to generate income or work for someone involved in the system. The disruption can be monumental.

This whole process took 10 years. A far cry from the short term computer projects we see today with follow on upgrades leaving everyone guessing about how to use the system and a system that is never integrated into the work flow of those involved.

Steve Lucas

Anonymous said...

Interfaces are failing, patients are not getting good care, medications are delayed. I never saw such horrific snafus with the infrastructures prior to the HIT experiment.