Thursday, February 28, 2013

Peering Underneath the Iceberg's Water Level: AMNews on the New ECRI "Deep Dive" Study of Health IT "Events"

FDA's Center for Devices and Radiological Health director Jeffrey Shuren MD JD voiced the opinion a few years ago that what FDA knows about health IT risks is the "tip of the iceberg" due to systematic impediments to knowledge gathering and diffusion.   See links to source here and to the FDA Internal Memo on HIT risk - labeled "internal document not intended for public use" and unearthed by investigative reporter Fred Schulte several years ago - here (PDF).

At my Feb. 9, 2013 post "A New ECRI Institute Study On Health Information Technology-Related Events" I opined that a new ECRI study was beginning to peer beneath the waterline of Jeff Shuren's iceberg, at what may reside below the visible tip.  Iceberg tips, needless to say, are usually tiny compared to an iceberg's overall size.

Reporter Kevin O'Reilly at AMNews (amednews.com) has now written about that ECRI report.

The results of the report are concerning:

 Ways EHRs can lead to unintended safety problems

Wrong records and failures in data transfer impede physicians and harm patients, according to an analysis of health technology incidents.

By Kevin B. O'Reilly, amednews staff,
posted Feb. 25, 2013.

In spring 2012, a surgeon tried to electronically access a patient’s radiology study in the operating room but the computer would show only a blue screen. The patient’s time under anesthesia was extended while OR staff struggled to get the display to function properly.

That is just one example of 171 health information technology-related problems reported during a nine-week period to the ECRI Institute PSO, a patient safety organization in Plymouth Meeting, Pa., that works with health systems and hospital associations in Kentucky, Michigan, Ohio, Tennessee and elsewhere to analyze and prevent adverse events.

Eight of the incidents reported involved patient harm, and three may have contributed to patient deaths, said the institute’s 48-page report, first made privately available to the PSO’s members and partners in December 2012. The report, shared with American Medical News in February, highlights how the health IT systems meant to make care safer and more efficient can sometimes expose patients to harm.

 Mar. 1, 2013 addendum.  From ECRI, the denominator is this:


Participating facilities submitted health IT related events during the nine-week period starting April 16, 2012, and ending June 19, 2012. ECRI Institute PSO pulled additional health IT events that were submitted by facilities during the same nine-week period as part of their routine process of submitting event reports to ECRI Institute PSO’s reporting program. The PSO Deep Dive analysis consisted of 171 health IT-related events submitted by 36 healthcare facilities, primarily hospitals.   [I note that's 36 of 5,724 hospitals in the U.S. per data from the American Hospital Association (link), or approx. 0.6%.  A very crude correction factor in extrapolation would be about x159 on the hospital-count issue alone, not including the effects of the voluntary nature of the study, of non-hospital EHR users, etc.  Extrapolating from 9 weeks to a year, the figure becomes about x1,000.  Accounting for the voluntary nature of the reporting (roughly 5% of cases reported, per Koppel), the corrective factor approaches x20,000.  Extrapolation would of course be less crude if the total number of beds, degree of participant EHR implementation/use, and numerous other factors were known, but the present reported numbers are a cause for concern - ed.]
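To make the arithmetic in that bracketed note explicit, here is a minimal sketch assuming only the figures given above (171 events, 36 facilities, 5,724 U.S. hospitals, a nine-week window, and Koppel's roughly 5% voluntary-reporting capture rate). It is a crude illustration of how the correction factors compound, not an estimate of actual harm:

# Back-of-envelope arithmetic from the bracketed editorial note above.
# Inputs are the figures cited in the post; nothing else is assumed.

reported_events = 171            # events submitted during the nine-week window
participating_hospitals = 36     # facilities that submitted reports
us_hospitals = 5724              # total U.S. hospitals (AHA figure cited above)
weeks_observed = 9
voluntary_capture_rate = 0.05    # ~5% of events captured by voluntary reporting (per Koppel)

hospital_factor = us_hospitals / participating_hospitals              # ~159x
annualized_factor = hospital_factor * 52 / weeks_observed             # ~919x, i.e. "about x1,000"
underreporting_factor = annualized_factor / voluntary_capture_rate    # ~18,400x, i.e. "approaches x20,000"

print(f"hospital-count factor:  ~{hospital_factor:.0f}x")
print(f"annualized factor:      ~{annualized_factor:.0f}x")
print(f"with underreporting:    ~{underreporting_factor:,.0f}x")

The point is only how quickly the multipliers stack up once the narrow sample, the short window, and the voluntary reporting are each accounted for.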

Sept. 2013 addendum: 

Health Leaders Media has more on the ECRI Deep Dive study at http://www.healthleadersmedia.com/print/TEC-290834/HIT-Errors-Tip-of-the-Iceberg-Says-ECRI:

HIT Errors 'Tip of the Iceberg,' Says ECRI
Cheryl Clark, for HealthLeaders Media , April 5, 2013

Healthcare systems' transitions from paper records to electronic ones are causing harm in so many serious ways that providers are only now beginning to understand the scope.

Computer programs truncated dosage fields, leading to morphine-caused respiratory arrest; lab test and transplant surgery records didn't talk to each other, leading to organ rejection and patient death; and an electronic system's misinterpretation of the time "midnight" meant an infant received antibiotics one dangerous day too late.

These are among the 171 health information technology malfunctions and disconnects that caused or could have caused patient harm, documented in a report to the ECRI Institute's Patient Safety Organization.

... The 36 hospitals that participated in the ECRI IT project are among the hospitals around the country for which ECRI serves as a Patient Safety Organization, or PSO.

The 171 documented events break down like this [a rough conversion of these shares into approximate event counts follows the list - ed.]:
  • 53% involved a medication management system
    • 25% involved a computerized order entry system
    • 15% involved an electronic medication administration record
    • 11% involved pharmacy systems
    • 2% involved automated dispensing systems
  • 17% were caused by clinical documentation systems
  • 13% were caused by lab information systems
  • 9% were caused by computers not functioning
  • 8% were caused by radiology or diagnostic imaging systems, including PACS
  • 1% were caused by clinical decision support systems
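As a rough reading aid, the sketch below converts the reported percentage shares into approximate counts out of the 171 events (rounding accounts for the top-level shares summing to slightly more than 100%). The grouping mirrors the list above; nothing beyond the reported figures is assumed:

# Approximate event counts implied by the reported percentage breakdown
# of the 171 health IT-related events (shares as reported above).

total_events = 171

top_level = {
    "medication management systems": 0.53,
    "clinical documentation systems": 0.17,
    "lab information systems": 0.13,
    "computers not functioning": 0.09,
    "radiology/diagnostic imaging (incl. PACS)": 0.08,
    "clinical decision support systems": 0.01,
}

# Sub-categories reported within the medication management share
medication_subtypes = {
    "computerized order entry": 0.25,
    "electronic medication administration record": 0.15,
    "pharmacy systems": 0.11,
    "automated dispensing systems": 0.02,
}

for name, share in top_level.items():
    print(f"{name:45s} ~{round(share * total_events):3d} events")

print("of which, within medication management:")
for name, share in medication_subtypes.items():
    print(f"  {name:43s} ~{round(share * total_events):3d} events")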

Karen Zimmer, MD, medical director of the institute, says the reports of so many types of errors and harm got the staff's attention in part because the program captured so many serious errors within just a nine-week project last spring.  The volume of errors in the voluntary reports was, she says, "an awareness raiser."

"If we're seeing this much under a voluntary reporting program, we know this is just the tip of the iceberg; we know these events are very much underreported."

As noted at the opening of this post, "tip of the iceberg" is a phrase also used by FDA CDRH director Jeffrey Shuren MD JD regarding safety issues with EHRs and other health IT.

Along those lines, at my April 2010 post "If The Benefits Of Healthcare IT Can Be Guesstimated, So Can And Should The Dangers" I proposed a "thought experiment" to theoretically extrapolate limited data on health IT risk to a national audience, taking into account factors that limited transparency and thus reduced known injury and fatality counts. The results were undesirable, to say the least - but it was a thought experiment only.

Using the current data - drawn from a limited, voluntary set of reports over nine weeks - I opine that the results of an extrapolation to a national (or worldwide) level, on an annual basis rather than a mere nine weeks, in an environment of rapidly increasing adopters (many of whom are new to the technology), would not look pretty.

The institute’s report did not rate whether electronic systems were any less safe than the paper records they replaced. The report is intended to alert hospitals and health systems to the unintended consequences of electronic health records.

Ethically, this is really not relevant to the question of a national rollout, especially with penalties beginning to accrue in a few years to non-adopters of HHS "Certified" technology.

As I've written on this blog, medical ethics generally do not condone experimentation without informed consent, especially when the experimental devices are of unknown risk. If the risks of health IT are not known, then ethically it does not matter what the safety of paper is.  "Hope" is not a valid reason for medical experimentation.  (See below for what a PubMed search reveals about the risks of paper records.)

The unspoken truth prevalent in healthcare today seems to be this:  the sacrifice of individual patients to a technology of unknown risk is OK, as long as - we hope -  it advances the greater good.    Perhaps that should be explicitly admitted by the HIT industry's hyper-enthusiast proponents who ignore the downsides, so the spin can be dropped and there can be clarity?

The leading cause of problems was general malfunctions [also known by the benign-sounding euphemism "glitches" - ed.], responsible for 29% of incidents. For example, following a consultation about a patient's wounds, a nurse at one hospital tried to enter instructions in the electronic record, but the system would not allow the nurse to type more than five characters in the comment field. Other times, medication label scanning functions failed, or an error message was incorrectly displayed every time a particular drug was ordered. One system failed to issue an alert when a pregnancy test was ordered for a male patient. [These 'general malfunctions' are thus not just computer bugs undetected due to inadequate pre-rollout testing, but also examples of design flaws reflecting designer-programmer-seller-buyer-implementer lack of due diligence, i.e., negligence - ed.]

A quarter of incidents were related to data output problems, such as retrieving the wrong patient record because the system does not ask the user to validate the patient's identity before proceeding. This kind of problem led to incorrect medication orders and, in one case, an unnecessary chest x-ray. Twenty-four percent of incidents were linked to data-input mistakes. For example, one nurse recorded blood glucose results for the wrong patient after typing the incorrect patient identification number to access the record.  [Many of these are likely due to what NIST has termed "use error" - user interface designs that lead users to make errors of commission or omission - as opposed to "user error," i.e., carelessness - ed.]

Most of the remaining event reports were related to data-transfer failures, such as a case where a physician’s order to stop anticoagulant medication did not properly transfer to the pharmacy system. The patient received eight extra doses of the medication before it was stopped. [Due to outright software, hardware and/or network problems and defects - ed.]
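On the wrong-record retrievals described above - where a system will act on a chart without asking the user to confirm the patient's identity - the missing safeguard is easy to picture. The sketch below is purely illustrative; the record fields, identifiers, and confirmation step are my own assumptions, not any vendor's actual product:

# Purely illustrative sketch: a patient-identity confirmation step before a
# retrieved record can be used for orders. Field names and the check itself
# are assumptions for illustration, not any actual EHR design.

from dataclasses import dataclass

@dataclass
class PatientRecord:
    record_id: str
    name: str
    date_of_birth: str   # ISO format, e.g. "1950-03-14"

def identity_matches(record: PatientRecord, expected_name: str,
                     expected_dob: str) -> bool:
    """Require a second identifier (name AND date of birth) to match
    before the chart is acted upon."""
    return (record.name.strip().lower() == expected_name.strip().lower()
            and record.date_of_birth == expected_dob)

def open_record_for_orders(record: PatientRecord, expected_name: str,
                           expected_dob: str) -> PatientRecord:
    """Hard stop on mismatch: refuse to proceed with a possibly wrong chart."""
    if not identity_matches(record, expected_name, expected_dob):
        raise ValueError(
            f"Identity check failed for record {record.record_id}: "
            "verify the patient before entering orders.")
    return record

# Example: a mistyped ID retrieves the wrong chart; the guard refuses it.
wrong_chart = PatientRecord("MRN-48213", "John Q. Smith", "1948-07-02")
try:
    open_record_for_orders(wrong_chart, "Joan Q. Smyth", "1950-03-14")
except ValueError as err:
    print(err)

The point of the sketch is only that a forcing function at the point of retrieval - rather than reliance on user vigilance - is the kind of design control whose absence the report describes.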

I've been writing about such issues since 1998, not because I imagined them.  As a CMIO I saw them firsthand; as a teacher and mentor I heard about them from colleagues; as a writer I heard about them via (usually unsolicited) emails from concerned clinicians; and as an independent expert witness on health IT harms I've heard about them from plaintiffs' attorneys, though not yet from the defense side of the Bar.  Of course the reasons for that are understandable - albeit disappointing.

In fact, robust studies of a serious issue - the actual risk of harm posed by paper records, and, further, whether any such problems are remediable without spending hundreds of billions of dollars on IT - seem scarce.  I've asked the PA Patient Safety Authority about the possibility of using data in the Pennsylvania Patient Safety Reporting System (PA-PSRS) database, just as they did for EHR-related medical events, to determine the incidence of paper-related medical events.  They are pondering the issue.

As an aside, I note that it would be ironic if the relative risks of both IT and paper were not really robustly known.  (I note that in a PubMed search on "risks of paper medical records", not much jumps out.)  IT hyper-enthusiasts will not even debate the issue of studying whether a good paper system might be safer for patients in some clinical environments than bad health IT.

Considering the tremendous cost and unknown risk of today's health IT (and perhaps the unknown risk of paper, too), would it not make more sense, and be consistent with the medical Oath, to leave paper in place where it is currently used - and perhaps improve its performance - until we "get the IT right" in controlled, sequestered environments, prior to national rollout?

In other words, as I've asked before on these pages, should we not slow down the IT push and adhere to traditional (and hard-learned) cautions on medical research?

Even asking such questions brings forth logical fallacies such as straw-man arguments (e.g., UCSF's Bob Wachter in a recent discussion I initiated with several investigative reporters: "...where we part ways is your defense of paper and pencil. I understand advocacy, and you have every right to bang this particular drum"), ad hominem attacks, etc.

... It is not enough for physicians and other health care leaders to shop carefully for IT systems, the report said. Ensuring that systems such as computerized physician order entry and electronic health records work safely has to be a continuing concern, said Karen P. Zimmer, MD, MPH, medical director of the ECRI Institute PSO.

“Minimizing the unintended consequences of health IT systems and maximizing the potential of health IT to improve patient safety should be an ongoing focus of every health care organization,” she said.

I have recommended that clinicians take matters into their own hands if their leadership does not act, as at the bottom of my post here.  This advice bears repeating:

... When a physician or other clinician observes health IT problems, defects, malfunctions, mission hostility (e.g., poor user interfaces), significant downtimes, lost data, erroneous data, misidentified data, and so forth ... and most certainly, patient 'close calls' or actual injuries ... they should (anonymously if necessary if in a hostile management setting):

(DISCLAIMER:  I am not responsible for any adverse outcomes if any organizational policies or existing laws are broken in doing any of the following.)


  • Inform their facility's senior management, if deemed safe and not likely to result in retaliation such as being slandered as a "disruptive physician" and/or being subjected to sham peer review (link).
  • Inform their personal and organizational insurance carriers, in writing. Insurance carriers do not enjoy paying out for preventable IT-related medical mistakes. They have begun to become aware of HIT risks. See, for example, the essay on Norcal Mutual Insurance Company's newsletter on HIT risks at this link. (Note - many medical malpractice insurance policies can be interpreted as requiring this reporting, observed occasional guest blogger Dr. Scott Monteith in a comment to me about this post.)
  • Inform the State Medical Society and local Medical Society of your locale.
  • Inform the appropriate Board of Health for your locale.
  • If applicable (and it often is), inform the Medicare Quality Improvement Organization (QIO) of your state or region. Example: in Pennsylvania, the QIO is "Quality Insights of PA."
  • Inform a personal attorney.
  • Inform local, state and national representatives such as congressional representatives. Sen. Grassley of Iowa is aware of these issues, for example.
  • As clinicians are often forced to use health IT, at their own risk even when "certified" (link), if a healthcare organization or HIT seller is sluggish or resistant in taking corrective actions, consider taking another risk (perhaps this is for the very daring or those near the end of their clinical career). Present your organization's management with a statement for them to sign to the effect of:
"We, the undersigned, do hereby acknowledge the concerns of [Dr. Jones] about care quality issues at [Mount St. Elsewhere Hospital] regarding EHR difficulties that were reported, namely [event A, event B, event C ... etc.]

We hereby indemnify [Dr. Jones] for malpractice liability regarding patient care errors that occur due to EHR issues beyond his/her control, but within the control of hospital management, including but not limited to: [system downtimes, lost orders, missing or erroneous data, etc.] that are known to pose risk to patients. We assume responsibility for any such malpractice.

With regard to health IT and its potential negative effects on care, Dr. Jones has provided us with the Joint Commission Sentinel Events Alert on Health IT at http://www.jointcommission.org/assets/1/18/SEA_42.PDF, the IOM report on HIT safety at http://www.modernhealthcare.com/Assets/pdf/CH76254118.PDF, and the FDA Internal Memorandum on H-IT Safety Issues at http://www.scribd.com/huffpostfund/d/33754943-Internal-FDA-Report-on-Adverse-Events-Involving-Health-Information-Technology.

CMO __________ (date, time)
CIO ___________ (date, time)
CMIO _________ (date, time)
General Counsel ___________ (date, time)
etc."
  • If the hospital or organizational management refuses to sign such a waiver (and they likely will!), note the refusal, with date and time of refusal, and file away with your attorney. It could come in handy if EHR-related med mal does occur.
  • As EHRs remain experimental, I note that indemnifications such as the above probably belong in medical staff contracts and bylaws when EHR use is coerced.

These recommendations still stand, although after this recent story, my caution about retaliation should be re-emphasized:

The Advisory Board Company
Feb. 14, 2013
Hospital Framed Physician; Planted a Gun

-- SS

2 comments:

Anonymous said...

System unavailability is common. Crashes of all records are common. Administrators who sham peer review those who complain are quick to recite the industry mantra after such crashes: "patient care was not affected".

InformaticsMD said...

Anonymous said...

System unavailability is common. Crashes of all records are common. Administrators who sham peer review those who complain are quick to recite the industry mantra after such crashes: "patient care was not affected".

While the risk levels (e.g., rates) of health IT-related medical mishaps are slowly becoming known, robust data on the rates of paper-related medical errors may not exist.

With data on paper's error rates lacking, and with the unknown error rate of IT (though the latter is starting to come out), the "ready-fire-aim" rush to IT is an act of cybernetic "faith" and "hope".

However, that faith and hope ignore the Oath, and either take the view that it's OK to sacrifice some patients for the "greater good" - or else the cyber-faithful don't bother to think about the issues at all, which represents an ethical vacuum, or perhaps nihilism.

Not a way to run an airline.

-- SS