
Wednesday, April 09, 2014

FDA on health IT risk: reckless, or another GM-like political coverup? "We don't know the magnitude of the risk, and what we do know is the tip of the iceberg, but health IT is of 'sufficiently low risk' that we don't need to regulate it."

In my Sept. 16, 2013 post "An Open Letter to David Bates, MD, Chair, ONC FDASIA Health IT Policy Committee on Recommendations Against Premarket Testing and Validation of Health IT" (http://hcrenewal.blogspot.com/2013/09/an-open-letter-to-david-bates-md-chair.html) I wrote:

David Bates, Chair, ONC FDASIA Health IT Policy Committee
via email
   
Dear David, I am disappointed (and in fact appalled) at the ONC FDASIA Health IT Policy Committee's recommendations that health IT including typical commercial EHR/CPOE systems not be subjected to a premarket testing and validation process.  I believe this recommendation is, quite frankly, negligent.

The resultant April 2014 report is out: "FDASIA Health IT Report: Proposed Strategy and Recommendations for a Risk-Based Framework."  It is available at http://www.healthit.gov/sites/default/files/fdasiahealthitreport_final.pdf.

The report is an academic idealist's and industry's dream.  It is a patient's nightmare.

Instead of regulation, the report recommends the creation of a nebulous "Health IT Safety Center" (pgs. 16 and 25) as a "public-private entity with broad stakeholder engagement, that includes a governance structure for the creation of a sustainable, integrated health IT learning system that avoids regulatory duplication and leverages and complements existing and ongoing efforts."

In other words, a toothless organization without true regulatory authority.

The recommendations of the Health IT Policy Committee have apparently been taken seriously by FDA. 

This FDA announcement recently appeared (FDA is part of HHS):

http://www.fda.gov/NewsEvents/Newsroom/PressAnnouncements/ucm390988.htm

HHS News Release
For Immediate Release: April 3, 2014

Proposed health IT strategy aims to promote innovation, protect patients, and avoid regulatory duplication

HHS today released a draft report that includes a proposed strategy and recommendations for a health information technology (health IT) framework, which promotes product innovation while maintaining appropriate patient protections and avoiding regulatory duplication. The congressionally mandated report was developed in consultation with health IT experts and consumer representatives and proposes to clarify oversight of health IT products based on a product’s function and the potential risk to patients who use it.

... As proposed in the draft report, posted on the ONC, FDA and FCC websites, there would be three health IT categories, based on function and level of risk, that focus on what the product does, not on the platform on which it operates (mobile medical device, PC, or cloud-based, for example).

The first category, products with administrative health IT functions, poses little or no risk to patient safety and as such requires no additional oversight. They include software for billing and claims processing, scheduling, and practice and inventory management.

The second category, products with health management health IT functions, includes software for health information and data management, medication management, provider order entry, knowledge management, electronic access to clinical results and most clinical decision support software.

Products with health management health IT functions are of sufficiently low risk and thus, if they meet the statutory definition of a medical device, FDA does not intend to focus its oversight on them. Instead, the draft report proposes relying primarily on ONC-coordinated activities and private sector capabilities that highlight quality management principles, industry standards and best practices.

The characterization of these admitted medical devices as "sufficiently low risk" is in my opinion grossly negligent.

For instance, GM concealed an ignition flaw that resulted in 13 deaths.  Look at what is now happening in that sector - revelations of corporate wrongdoing and Congressional investigations:

http://www.nytimes.com/2014/03/31/business/us-regulators-declined-full-inquiry-into-gm-ignition-flaws-memo-shows.html

U.S. Agency Knew About G.M. Flaw but Did Not Act
By MATTHEW L. WALD
MARCH 30, 2014

Federal regulators decided not to open an inquiry on the ignitions of Chevrolet Cobalts and other cars even after their own investigators reported in 2007 that they knew of four fatal crashes, 29 complaints and 14 other reports that showed the problem disabled air bags, according to a memo released by a House subcommittee on Sunday.

Then in 2010, the safety agency came to the same decision after receiving more reports that air bags were not deploying.

The memo also revealed that General Motors approved the faulty design of the switch in 2002 even though the company that made the part, Delphi, warned the automaker that the switch did not meet specifications. This followed a warning the year before — when the Saturn Ion was being developed — but G.M. said that “a design change had solved the problem,” according to the memo.

The striking new details in the memo bolster the contention that both G.M. and the National Highway Traffic Safety Administration, more than previously acknowledged, ignored or dismissed warnings for more than a decade about a faulty ignition switch that, if bumped, could turn off, shutting the engine and disabling the air bags. General Motors has recalled nearly 2.6 million cars and has linked 13 deaths to the defect.

13 deaths, and we get this:

A House subcommittee, which gathered more than 200,000 pages of documents from G.M. and 6,000 pages from the agency, will hold a hearing on Tuesday. Mary T. Barra, General Motors’ chief executive, and David Friedman, the acting administrator of the safety agency, are scheduled to testify. Both are also scheduled to testify before a Senate panel on Wednesday.

“The problems persisted over a decade, the red flags were many, and yet those responsible failed to connect the dots,” Fred Upton, a Republican of Michigan and the chairman of the House Energy and Commerce Committee, said in a statement.

The most damaging finding in the memo concerned the four fatal crashes that went unheeded by regulators.

In a presentation dated Nov. 17, 2007, the safety agency’s investigators reported to its Office of Defects Investigation on the fatal crashes, as well as a broad range of complaints and other reports about cars shutting off.

“The panel did not identify any discernible trend and decided not to pursue a more formal investigation,” the House memo said. The findings reinforce an analysis by The New York Times published March 8 that found the agency had received more than 260 complaints about Cobalts, Ions and other cars in the recall, citing potentially dangerous shutdowns.

While GM is being grilled and raked over the coals, the health IT industry is likely going to receive the healthcare sector's most generous and unprecedented special regulatory accommodation from FDA, its products having been deemed of "sufficiently low risk" (?) so as not to merit attention.

--------------------------------------------

The FDA statement is particularly reckless (or worse) given the following points.

I add that the following are just highlights off the top of my head, and far from a complete inventory of known facts on health IT risk:

(1)  FDA's own 2010 "Internal Memo on HIT Risk", "not intended for public use" but exposed by Center for Public Integrity investigative reporter Fred Schulte (http://www.publicintegrity.org/authors/fred-schulte) when he was with the Huffington Post Investigative Fund (as described at http://hcrenewal.blogspot.com/2010/08/smoking-gun-internal-fda-memorandum-of.html), stated:
 

...In summary, the results of this data review suggest significant clinical implications and public safety issues surrounding Health Information Technology. The most commonly reported H-IT safety issues included wrong patient/wrong data, medication administration issues, clinical data loss/miscalculation, and unforeseen software design issues; all of which have varying impact on the patient’s clinical care and outcome, which included 6 deaths and 43 injuries. The absence of mandatory reporting enforcement of H-IT safety issues limits the number of relevant MDRs [device reports] and impedes a more comprehensive understanding of the actual problems and implications.

They then go on to categorize the known risk factors and the impediments to knowing about their occurrence: 

Limitations of the MAUDE search and final subset of MDRs include the following: 

1. Not all H-IT safety issue MDRs can be captured due to limitations of reporting practices including
...(a) Vast number of H-IT systems that interface with multiple medical devices currently assigned to multiple procodes making it difficult to identify specific procodes for H-IT safety issues;
...(b) Procode assignments are also affected by the ability of the reporter/contractor to correctly identify the event as a H-IT safety issue;
...(c) Correct identification by the reporter of the suspect device brand name is challenged by difficulties discerning the actual H-IT system versus the device it supports.
2. Due to incomplete information in the MDRs, it is difficult to unduplicate similar reports, potentially resulting in a higher number of reports than actual events.
3. Reported death and injury events may only be associated with the reported device but not necessarily attributed to the device.
4. Correct identification by the reporter of the manufacturer name is convoluted by the inability to discern the manufacturer of the actual H-IT system versus the device it supports.
5. The volume of MDR reporting to MAUDE may be impacted by a lack of understanding the reportability of H-IT safety issues and enforcement of such reporting.

In other words, FDA does not know the level of risk due to systematic impediments to information diffusion.


Internal FDA Memo ("not intended for public use") on potential dangers of health IT.  Download the full PDF by clicking here.

I think it axiomatic that it is, at best, seriously misleading to now declare a technology to be of "low risk" while having internally admitted that the risk level is not known due to impediments to that knowledge.  That situation has not changed since this memo's internal FDA dissemination and its disclosure by Schulte.

This is inexplicable on its face on scientific grounds.

Further, another HHS agency, the Agency for Healthcare Research and Quality (AHRQ, http://www.ahrq.gov/) has also taxonomized in detail a rich tapestry of "how health IT can cause harm."  See "AHRQ Health IT Hazards Manager Report" at http://healthit.ahrq.gov/sites/default/files/docs/citation/HealthITHazardManagerFinalReport.pdf and its contained list of harms on pg. 29, reproduced below.

AHRQ has no regulatory authority.  These failure modes are expensive to deal with, thus many will likely only be dealt with by the industry (if at all) after patient harms or near-misses.  Patients, however, are not experimental subjects for the debugging and "improvement" of experimental IT systems.

AHRQ Health IT Hazard Manager Report - Hazard Modes of Health IT (click to enlarge)

Surely FDASIA is aware of this.


(2)  FDA CDRH Director Jeff Shuren MD, JD's own statement, made at the HIT Policy Committee Adoption/Certification Workgroup on February 25, 2010, where the topic was "HIT safety," that the known risks are "the tip of an iceberg" (the text is available at this link):

...... In the past two years, we [FDA] have received 260 reports of HIT-related malfunctions with the potential for patient harm – including 44 reported injuries and 6 reported deaths. Because these reports are purely voluntary, they may represent only the tip of the iceberg in terms of the HIT-related problems that exist.

Considering the admitted impediments to information diffusion, it is likely that even then FDA knew the number of injuries and deaths was far larger than GM's mere 13 ignition defect-related deaths.

(3)   FDA's MAUDE (Manufacturer and User Facility Device Experience) database is a voluntary medical device defects reporting system, but is not specifically purposed for health IT and is almost unknown to typical health IT users as a resource for reporting flaws. 

MAUDE reports show health IT devices are literally festooned with an incredible number of flaws, as if they came from dirt-floored software sweatshops with little oversight.

These are defects that have made it into live hospital and caregiver devices, and have led to reported harms and death (see http://hcrenewal.blogspot.com/2011/01/maude-and-hit-risk-mother-mary-what-in.html).

The voluntary nature of this MAUDE reporting by just a few HIT sellers, and lack of knowledge of it by health IT users as a resource, cannot fail to lead to the conclusion that the problems are far worse than this limited dataset indicates.

(4)  The prestigious U.S. Institute of Medicine of the National Academies (http://www.iom.edu/), perhaps the top scientific authority in this country, also admitted the risk levels of health IT were unknown due to systematic impediments to knowing.  IOM noted this in their late 2011 study on EHR safety:

... While some studies suggest improvements in patient safety can be made, others have found no effect. Instances of health IT–associated harm have been reported. However, little published evidence could be found quantifying the magnitude of the risk.

Several reasons health IT–related safety data are lacking include the absence of measures and a central repository (or linkages among decentralized repositories) to collect, analyze, and act on information related to safety of this technology. Another impediment to gathering safety data is contractual barriers (e.g., nondisclosure, confidentiality clauses) that can prevent users from sharing information about health IT–related adverse events. These barriers limit users’ abilities to share knowledge of risk-prone user interfaces, for instance through screenshots and descriptions of potentially unsafe processes. In addition, some vendors include language in their sales contracts and escape responsibility for errors or defects in their software (i.e., “hold harmless clauses”). The committee believes these types of contractual restrictions limit transparency, which significantly contributes to the gaps in knowledge of health IT–related patient safety risks.

The IOM did note, contrary to FDA's current finding that these systems are of "sufficiently low risk", that:

These barriers to generating evidence pose unacceptable risks to safety. [IOM (Institute of Medicine). 2012. Health IT and Patient Safety: Building Safer Systems for Better Care (PDF). Washington, DC: The National Academies Press, pg. S-2.]

Also in the IOM report:

… “For example, the number of patients who receive the correct medication in hospitals increases when these hospitals implement well-planned, robust computerized prescribing mechanisms and use barcoding systems. But even in these instances, the ability to generalize the results across the health care system may be limited. For other products— including electronic health records, which are being employed with more and more frequency— some studies find improvements in patient safety, while other studies find no effect.

More worrisome, some case reports suggest that poorly designed health IT can create new hazards in the already complex delivery of care. Although the magnitude of the risk associated with health IT is not known, some examples illustrate the concerns. Dosing errors, failure to detect life-threatening illnesses, and delaying treatment due to poor human–computer interactions or loss of data have led to serious injury and death.”

IOM could not have stated the reality more explicitly than via their statement "the magnitude of the risk associated with health IT is not known."

(5)   Surely FDA (and the FDASIA committees responsible for the new Report) are aware of the results of the ECRI Institute's Deep Dive study, carried out within its PSO-member hospitals (http://hcrenewal.blogspot.com/2013/02/peering-underneath-icebergs-water-level.html).  I would call 171 voluntarily reported health IT-related mishaps in 36 hospitals over 9 weeks, with 8 injuries and 3 possible deaths, alarming.

In fact, in 2014 based on their PSO data, the ECRI Institute identified health IT as #1 of the "Top Ten Technology Risks in Healthcare" (http://hcrenewal.blogspot.com/2014/04/in-ecri-institutes-new-2014-top-10.html):

CONCERN #1: Data integrity failures with health information technology systems 

With the federal government offering financial incentives for hospitals and physician practices to adopt EHR systems, use of these systems more than tripled from 2009 through 2012. “Health IT systems are very complex,” says James P. Keller, M.S., vice president, technology evaluation and safety, ECRI Institute. “They are managing a lot of information, and it’s easy to get something wrong” if the systems are not designed and implemented well. While appropriately designed and implemented systems can provide complete, current, and accurate patient care information so that the clinician can make appropriate treatment decisions, the presence of incorrect data can lead to incorrect treatment, potentially leading to patient harm.

Karen Zimmer, MD, medical director of the ECRI Institute, says the reports of so many types of errors and harm got the staff's attention in part because the program captured so many serious errors within just a nine-week project last spring.  The volume of errors in the voluntary reports was, she says:

"an awareness raiser.  If we're seeing this much under a voluntary reporting program, we know this is just the tip of the iceberg; we know these events are very much underreported" (http://www.healthleadersmedia.com/print/TEC-290834/HIT-Errors-Tip-of-the-Iceberg-Says-ECRI).

I certainly would NOT aver that this technology is of "sufficiently low risk" so as not to warrant formal regulation by a body skilled in software regulation, as is FDA, e.g., for medical devices and for IT used in pharmaceutical clinical trials data management and drug manufacturing.

See for example "General Principles of Software Validation; Final Guidance for Industry and FDA Staff" at http://www.fda.gov/medicaldevices/deviceregulationandguidance/guidancedocuments/ucm085281.htm.

(I would really not enjoy being in a position to have to make such an averment about "sufficiently low risk" under GM ignition investigation-like conditions... )

(6)  Surely FDA (and the FDASIA committees responsible for the new Report) are aware of the data reported by the medical malpractice insurer CRICO, that insures the Harvard medical community (see http://hcrenewal.blogspot.com/2014/02/patient-safety-quality-healthcare.html):

... CRICO recently analyzed a year’s worth of medical malpractice claims in its comparative database [from numerous sources] and found 147 cases in which EHRs were a contributing factor. Computer systems that don’t “talk” to each other, test results that aren’t routed properly, and mistakes caused by faulty data entry or copying and pasting were among the EHR-related problems found in the claims, which represented $61 million in direct payments and legal expenses.  ... Half of the 147 cases resulted in severe injury.  [Death numbers are unstated - ed.]

Note that these numbers come from an examination of filed lawsuits; most medical malpractice never reaches an actual filed lawsuit due to the unfavorable economics of the medical malpractice industry.  Perhaps 5% of actual events ever get filed, thus CRICO's data is also a "tip of an iceberg."

(7)  Surely FDA (and the FDASIA committees responsible for the new Report) are aware of the mass risk introduced by faulty systems, such as in the incident in Rhode Island at Lifespan Healthcare (http://hcrenewal.blogspot.com/2011/11/lifespan-rhode-island-yet-another.html), where thousands of prescriptions were altered by faulty HIT without physician knowledge.  Suffixes for the long-acting version of a drug (e.g., "XL" or "XR") were dropped:

PROVIDENCE, R.I. (WPRI) - Rhode Island State Senator Jamie Doyle says he is shocked to hear a Lifespan computer glitch caused thousands of patients to receive the wrong types of medication. [Appx. 2,000 across five Lifespan hospitals according to the Providence Journal, see below, and the WaPo - ed.]

Doyle is now calling for an independent review of all the hospitals Lifespan runs, and a review of the Rhode Island Department of Health.

The DOH is investigating after learning patients who were supposed to receive medications taken once a day instead received medications meant to be taken more than once per day.  [In other words, they were taking short acting drugs on a long-acting once a day schedule, thus being massively under-dosed - ed.]

(8)  The FDA (and the FDASIA committees responsible for the new Report) surely cannot be unaware of all of the "glitches" reported on in-situ health IT systems, as cataloged at this blog under the tag "glitch" and as of this writing numbering 25 posts, some including patient deaths.  See the results of the query link: http://hcrenewal.blogspot.com/search/label/glitch.

(9) Surely FDA (and the FDASIA committees responsible for the new Report) cannot be unaware that FDA itself had to recall health IT with flaws "serious enough to cause patient death":

FDA recall.  McKesson Anesthesia Care – Patient Case Data May Not Match Patient Data.  Use of this affected product may cause serious adverse health consequences, including death.
http://hcrenewal.blogspot.com/2014/03/ehr-recall-use-of-this-affected-product.html

FDA Recalls Draeger Health IT Device Because "This Product May Cause Serious Adverse Health Consequences, Including Death"
http://hcrenewal.blogspot.com/2011/12/fda-recalls-health-it-software-because.html

FDA Recall: Philips Xcelera Connect - Incomplete Information Arriving From Other Systems
http://hcrenewal.blogspot.com/2012/07/health-it-fda-recall-philips-xcelera.html

FDA recall:  An ED EHR "Glitch" That Mangles Prescriptions
http://hcrenewal.blogspot.com/2013/08/a-good-way-to-cynernetically-harm-or.html

(10)  FDA (and the FDASIA committees responsible for the new Report) are surely not unaware of my own report to its MAUDE database (filed by me because the hospital denied my request that it file the report itself) of a fundamental defect in a widely used health IT system, Eclipsys Sunrise, that nearly caused the death of my mother... AFTER she was seriously injured due to a flaw in an ED's EHR, an injury that led to her death a year later:

http://www.accessdata.fda.gov/scripts/cdrh/cfdocs/cfmaude/detail.cfm?mdrfoi__id=1729552

ECLIPSYS SUNRISE CLINICAL MANAGER      
Event Date 06/12/2010
Event Type  Injury   Patient Outcome  Hospitalization,Other,Life Threatening,Required Intervention
Event Description

The eclipsys sunrise clinical manager emr/cpoe is in use at (b) (6) hospital. This system appears to have a serious deficiency: an apparent lack of date constraint validity checking when users are seeking information on prior medication orders and administration. (b) (6), (b) (6) was admitted (b) (6) 2010, for cerebrovascular problems. Famotidine iv was started prophylactically to prevent gastric problems, but was discontinued and changed to protonix a few days later due to suspicion of causing severe agitation and confusion, causing her to pull her lines. Around (b) (6) 2010, patient was transferred to physical therapy floor in the hospital. Hospital considers such a transfer to be a "discharge" and "readmission. " however, patient did poorly, had to be sent back to med-surg floor several days later. Due to having pulled a percutaneous endoscopic gastrostomy tube -peg tube-, she was started on TPN on (b) (6) 2010. In the first bag of tpn solution was 40 mg. Famotidine. Patient's mental status began deteriorating and she began to get very, very agitated and confused once again. Eclipsys sunrise problem/bug was noted when son, a physician informaticist -and author of this report- noted the 40 mg famotidine in the tpn bag, and reported to the floor rn that this was contraindicated. The rn pulled up the medication order and administration records on the eclipsys sunrise clinical manager system to confirm famotidine had been administered previously. Famotidine was shown only as being ordered for tpn on (b) (6), but not prior. The rn selected a date constraint entry box, at which time a calendar widget popped up. The rn set the date constraint back to (b) (6), then (b) (6), then (b) (6) 2010, but the use of the famotidine at that time did not appear. Son of patient demanded the famotidine be stopped anyway, which it was, and that an allergy entry be entered. However, it was fortunate son was a physician and remembered the name of the medicine. In tracing this error today, it was found that the data from admission on (b) (6) to "discharge" to physical therapy floor (b) (6) 2010, was unavailable since it was considered a "prior admission" although the patient had never left the hospital physically. The sunrise system, however, failed to flag the (b) (6) 2010 date constraint entries as 'out of bounds' for the current admission. The validity of the date constraints was not checked, preventing the rn user from realizing he should have called up the prior admissions records. If son of patient had not been a physician, famotidine might have been continued and patient might have suffered more severe consequences than she did after the 40 mg was given in the first tpn bag. As it was, she spent the next three days in a severe delirium and is only slowly recovering. That the eclipsys sunrise ehr did not alert its user that the date constraints they set for seeking medication ordering and administration data were out of bounds -beyond the range of the current "admission" -is a significant and severe oversight and/or bg, and a danger to patients not fortunate enough to have a physician informaticist son closely monitoring their care.
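
The core defect described in this report is, in essence, a missing bounds check: the system silently accepted a medication-history date range reaching before the start of the current "admission," where the relevant records were filed under a prior admission and therefore never displayed.  Below is a minimal sketch of the kind of validation that was absent, using hypothetical field and function names; it is illustrative only, not the vendor's actual code.

    import warnings
    from datetime import date

    def check_history_lookback(query_start: date, admission_start: date) -> bool:
        """Return True if a medication-history query stays within the current
        admission; otherwise warn that prior-admission records will not appear.
        (Hypothetical illustration of the missing check, not vendor code.)"""
        if query_start >= admission_start:
            return True
        warnings.warn(
            "Requested date range precedes the current admission; "
            "orders and administrations from prior admissions are not shown in this view."
        )
        return False

    # In the scenario above, the nurse set the lookback to dates before the
    # "readmission" to the physical therapy floor, and the system displayed
    # nothing, with no such flag that the range was out of bounds.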

Addendum May 4, 2014:

(11)   CMS: "we do not have any information that supports or refutes claims that a broader adoption of EHRs can save lives."  (But let's spend hundreds of billions of dollars anyway.)



CMS: "we do not have any information that supports or refutes claims that a broader adoption of EHRs can save lives."  [But let's spend hundreds of billions of dollars anyway.]  Click to enlarge.

--------------------------------------------

Note that these points, all I had time to write about this morning, themselves represent a "tip of the iceberg" regarding health IT risks.

In my opinion it is immensely reckless for FDA to state that "products with health management health IT functions" - including software for health information and data management, medication management, provider order entry, knowledge management, electronic access to clinical results and most clinical decision support software - "are of sufficiently low risk" as not to warrant formal regulation of the kind already in place for other healthcare software sectors.

Fortunately, the public gets to comment via this announcement:

Proposed Risk-Based Regulatory Framework and Strategy for Health Information Technology Report; Notice to Public of Availability of the Report and Web Site Location; Request for Comments
http://www.regulations.gov/#!documentDetail;D=FDA_FRDOC_0001-4678

I feel, however, that the decision is a given, and that the health IT industry has such regulatory capture that public comments will be ignored.  (Not that I won't try.)

Finally:

I ponder what ever happened to the responses to the Senate Finance Committee's inquiry about health IT defects and harm, as reported in the Washington Post:

Electronic medical records draw frequent criticisms
By Alexi Mostrous
Washington Post Staff Writer
Sunday, October 25, 2009

http://www.washingtonpost.com/wp-dyn/content/article/2009/10/24/AR2009102400967.html?hpid%3Dtopnews&sub=AR

... the Senate Finance Committee has amassed a thick file of testimony alleging serious computer flaws from doctors, patients and engineers unhappy with current systems.

On Oct. 16, the panel wrote to 10 major sellers of electronic record systems, demanding to know, for example, what steps they had taken to safeguard patients. "Every accountability measure ought to be used to track the stimulus money invested in health information technology," said Sen. Charles E. Grassley (Iowa), the panel's ranking Republican.

The results of that request seem to have gone to the recycle bin.

-- SS

Thursday, February 28, 2013

Peering Underneath the Iceberg's Water Level: AMNews on the New ECRI "Deep Dive" Study of Health IT "Events"

FDA's Center for Devices and Radiological Health director Jeffrey Shuren MD JD voiced the opinion a few years ago that what FDA knows about health IT risks is the "tip of the iceberg" due to systematic impediments to knowledge gathering and diffusion.   See links to source here and to the FDA Internal Memo on HIT risk - labeled "internal document not intended for public use" and unearthed by investigative reporter Fred Schulte several years ago - here (PDF).

At my Feb. 9, 2013 post "A New ECRI Institute Study On Health Information Technology-Related Events" I opined that a new ECRI study was beginning to peer beneath the waterline of Jeff Shuren's iceberg, at what may reside below its tip.  Iceberg tips, needless to say, are usually tiny compared to an iceberg's overall size.

Reporter Kevin O'Reilly at AMNews (amednews.com) has now written about that ECRI report.

The results of the report are concerning:

 Ways EHRs can lead to unintended safety problems

Wrong records and failures in data transfer impede physicians and harm patients, according to an analysis of health technology incidents.

By Kevin B. O'Reilly, amednews staff,
posted Feb. 25, 2013.

In spring 2012, a surgeon tried to electronically access a patient’s radiology study in the operating room but the computer would show only a blue screen. The patient’s time under anesthesia was extended while OR staff struggled to get the display to function properly.

That is just one example of 171 health information technology-related problems reported during a nine-week period to the ECRI Institute PSO, a patient safety organization in Plymouth Meeting, Pa., that works with health systems and hospital associations in Kentucky, Michigan, Ohio, Tennessee and elsewhere to analyze and prevent adverse events.

Eight of the incidents reported involved patient harm, and three may have contributed to patient deaths, said the institute’s 48-page report, first made privately available to the PSO’s members and partners in December 2012. The report, shared with American Medical News in February, highlights how the health IT systems meant to make care safer and more efficient can sometimes expose patients to harm.

 Mar. 1, 2013 addendum.  From ECRI, the denominator is this:


Participating facilities submitted health IT related events during the nine-week period starting April 16, 2012, and ending June 19, 2012. ECRI Institute PSO pulled additional health IT events that were submitted by facilities during the same nine-week period as part of their routine process of submitting event reports to ECRI Institute PSO’s reporting program. The PSO Deep Dive analysis consisted of 171 health IT-related events submitted by 36 healthcare facilities, primarily hospitals.   [I note that's 36 of 5,724 hospitals in the U.S. per data from the American Hospital Association (link), or appx. 0.6%.  A very crude correction factor in extrapolation would be about x 159 on the hospital count issue alone, not including the effects of the voluntary nature of the study, of non-hospital EHR users, etc.  Extrapolating from 9 weeks to a year, the figure becomes about x 1000.  Accounting for the voluntary nature of the reporting (5% of cases per Koppel), the corrective figure approaches x20,000.  Extrapolation would of course be less crude if total bed counts, degree of participant EHR implementation/use, and numerous other factors were known, but the present reported numbers are a cause for concern - ed.]
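
For readers who want to see how those crude correction factors compound, here is a minimal back-of-envelope sketch using only the figures cited in the bracketed note above (36 of 5,724 hospitals, a 9-week window, an assumed ~5% voluntary capture rate); it is illustrative arithmetic, not an incidence estimate:

    # Back-of-envelope compounding of the crude correction factors cited above
    # (illustrative arithmetic only, not a claim of actual incidence).
    hospitals_reporting = 36
    us_hospitals = 5724            # AHA hospital count cited above
    weeks_observed = 9
    weeks_per_year = 52
    voluntary_capture_rate = 0.05  # ~5% of events ever reported, per Koppel

    hospital_factor = us_hospitals / hospitals_reporting   # x 159
    annual_factor = weeks_per_year / weeks_observed        # x ~5.8
    reporting_factor = 1 / voluntary_capture_rate          # x 20

    print(round(hospital_factor),
          round(hospital_factor * annual_factor),
          round(hospital_factor * annual_factor * reporting_factor))
    # -> 159 919 18373, i.e., the "x 159", "about x 1000", and "approaches
    #    x20,000" correction factors described in the bracketed note above.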

Sept. 2013 addendum: 

Health Leaders Media has more on the ECRI Deep Dive study at http://www.healthleadersmedia.com/print/TEC-290834/HIT-Errors-Tip-of-the-Iceberg-Says-ECRI:

HIT Errors 'Tip of the Iceberg,' Says ECRI
Cheryl Clark, for HealthLeaders Media , April 5, 2013

Healthcare systems' transitions from paper records to electronic ones are causing harm and in so many serious ways, providers are only now beginning to understand the scope.

Computer programs truncated dosage fields, leading to morphine-caused respiratory arrest; lab test and transplant surgery records didn't talk to each other, leading to organ rejection and patient death; and an electronic systems' misinterpretation of the time "midnight" meant an infant received antibiotics one dangerous day too late.

These are among the 171 health information technology malfunctions and disconnects that caused or could have caused patient harm in a report to the ECRI Institute's Patient Safety Organization.

... The 36 hospitals that participated in the ECRI IT project are among the hospitals around the country for which ECRI serves as a Patient Safety Organization, or PSO.

The 171 events documented break down like this:
  • 53% involved a medication management system.
    • 25% involved a computerized order entry system.
    • 15% involved an electronic medication administration record.
    • 11% involved pharmacy systems.
    • 2% involved automated dispensing systems.
  • 17% were caused by clinical documentation systems.
  • 13% were caused by lab information systems.
  • 9% were caused by computers not functioning.
  • 8% were caused by radiology or diagnostic imaging systems, including PACS.
  • 1% were caused by clinical decision support systems.

Karen Zimmer, MD, medical director of the institute, says the reports of so many types of errors and harm got the staff's attention in part because the program captured so many serious errors within just a nine-week project last spring.  The volume of errors in the voluntary reports was, she says, "an awareness raiser."

"If we're seeing this much under a voluntary reporting program, we know this is just the tip of the iceberg; we know these events are very much underreported."

As at the opening of this post, "tip of the iceberg" is a phrase also used by FDA CDRH director Jeffrey Shuren MD JD regarding safety issues with EHRs and other health IT.

Along those lines, at my April 2010 post "If The Benefits Of Healthcare IT Can Be Guesstimated, So Can And Should The Dangers" I proposed a "thought experiment" to theoretically extrapolate limited data on health IT risk to a national audience, taking into account factors that limited transparency and thus reduced known injury and fatality counts. The results were undesirable, to say the least - but it was a thought experiment only.

Using the current data, which come from a limited, voluntary set of reports over 9 weeks, I opine that the results of an extrapolation to a national (or worldwide) level, on an annual basis rather than a mere 9 weeks, in an environment of rapidly increasing adopters (many of whom are new to the technology), would not look pretty.

The institute’s report did not rate whether electronic systems were any less safe than the paper records they replaced. The report is intended to alert hospitals and health systems to the unintended consequences of electronic health records.

Ethically, this is really not relevant to national rollout, especially with penalties beginning to accrue in a few years to non-adopters of HHS "Certified" technology.

As I've written on this blog, medical ethics generally do not condone experimentation without informed consent, especially when the experimental devices are of unknown risk. Not knowing the risks of IT, it really doesn't matter, ethically, what the safety of paper is.  "Hope" is not a valid reason for medical experimentation.  (See below for what a PubMed search reveals about risks of paper records.)

The unspoken truth prevalent in healthcare today seems to be this:  the sacrifice of individual patients to a technology of unknown risk is OK, as long as - we hope -  it advances the greater good.    Perhaps that should be explicitly admitted by the HIT industry's hyper-enthusiast proponents who ignore the downsides, so the spin can be dropped and there can be clarity?

The leading cause of problems was general malfunctions [also known by the benign-sounding euphemism "glitches" - ed.]  responsible for 29% of incidents. For example, following a consultation about a patient’s wounds, a nurse at one hospital tried to enter instructions in the electronic record, but the system would not allow the nurse to type more than five characters in the comment field. Other times, medication label scanning functions failed, or an error message was incorrectly displayed every time a particular drug was ordered. One system failed to issue an alert when a pregnancy test was ordered for a male patient. [These 'general malfunctions' are thus not just computer bugs undetected due to inadequate pre-rollout testing, but also examples of design flaws due to designer-programmer-seller-buyer-implementer lack of due diligence, i.e.,  negligence - ed.]

A quarter of incidents were related to data output problems, such as retrieving the wrong patient record because the system does not ask the user to validate the patient identity before proceeding. This kind of problem led to incorrect medication orders and in one case an unnecessary chest x-ray. Twenty-four percent of incidents were linked to data-input mistakes. For example, one nurse recorded blood glucose results for the wrong patient due to typing the incorrect patient identification number to access the record.  [Many of these are likely due to what NIST has termed "use error" - user interface designs that will engender users to make errors of commission or omission - as opposed to "user error" i.e., carelessness - ed.]

Most of the remaining event reports were related to data-transfer failures, such as a case where a physician’s order to stop anticoagulant medication did not properly transfer to the pharmacy system. The patient received eight extra doses of the medication before it was stopped. [Due to outright software, hardware and/or network problems and defects - ed.]

I've been writing about such issues since 1998, not because I imagined them.  As a CMIO I saw them firsthand; as teacher and mentor I heard about them from colleagues; as a writer I heard about them via (usually unsolicited) emails from concerned clinicians; as an independent expert witness on health IT harms I've heard about them from Plaintiff's attorneys, but not from the Defense side of the Bar as yet.  Of course the reasons for that are understandable -  albeit disappointing.

In fact, robust studies of a serious issue - the actual risk of harm posed by paper records - and, further, of whether any of those risks are remediable without spending hundreds of billions of dollars on IT, seem scarce.  I've asked the PA Patient Safety Authority about the possibility of using data in the Pennsylvania Patient Safety Reporting System (PA-PSRS) database, just as they did for EHR-related medical events, to determine the incidence of paper-related medical events.  They are pondering the issue.

As an aside, I note that it would be ironic if the relative risks of both IT and paper were not really robustly known.  (I note that in a PubMed search on "risks of paper medical records", not much jumps out.)  IT hyper-enthusiasts will not even debate the issue of studying whether a good paper system might be safer for patients in some clinical environments than bad health IT.

Considering the tremendous cost and unknown risk of today's health IT (and perhaps the unknown risk of paper, too), would it not make more sense, and be consistent with the medical Oath, to leave paper in place where it is currently used - and perhaps improve its performance - until we "get the IT right" in controlled, sequestered environments, prior to national rollout?

In other words, as I've asked before on these pages, should we not slow down the IT push and adhere to traditional (and hard-learned) cautions on medical research?

Even asking such questions brings forth logical fallacies such as straw arguments (e.g., UCSF's Bob Wachter in a recent discussion I initiated with several investigative reporters: "...where we part ways is your defense of paper and pencil. I understand advocacy, and you have every right to bang this particular drum"), ad hominem attacks, etc.

... It is not enough for physicians and other health care leaders to shop carefully for IT systems, the report said. Ensuring that systems such as computerized physician order entry and electronic health records work safely has to be a continuing concern, said Karen P. Zimmer, MD, MPH, medical director of the ECRI Institute PSO.

“Minimizing the unintended consequences of health IT systems and maximizing the potential of health IT to improve patient safety should be an ongoing focus of every health care organization,” she said.

I recommended that clinicians take matters into their own hands if their leaders do not, as at the bottom of my post here.  This advice bears repeating:

... When a physician or other clinician observes health IT problems, defects, malfunctions, mission hostility (e.g., poor user interfaces), significant downtimes, lost data, erroneous data, misidentified data, and so forth ... and most certainly, patient 'close calls' or actual injuries ... they should (anonymously if necessary if in a hostile management setting):

(DISCLAIMER:  I am not responsible for any adverse outcomes if any organizational policies or existing laws are broken in doing any of the following.)


  • Inform their facility's senior management, if deemed safe and not likely to result in retaliation such as being slandered as a "disruptive physician" and/or being subjected to sham peer review (link).
  • Inform their personal and organizational insurance carriers, in writing. Insurance carriers do not enjoy paying out for preventable IT-related medical mistakes. They have begun to become aware of HIT risks. See, for example, the essay on Norcal Mutual Insurance Company's newsletter on HIT risks at this link. (Note - many medical malpractice insurance policies can be interpreted as requiring this reporting, observed occasional guest blogger Dr. Scott Monteith in a comment to me about this post.)
  • Inform the State Medical Society and local Medical Society of your locale.
  • Inform the appropriate Board of Health for your locale.
  • If applicable (and it often is), inform the Medicare Quality Improvement Organization (QIO) of your state or region. Example: in Pennsylvania, the QIO is "Quality Insights of PA."
  • Inform a personal attorney.
  • Inform local, state and national representatives such as congressional representatives. Sen. Grassley of Iowa is aware of these issues, for example.
  • As clinicians are often forced to use health IT, at their own risk even when "certified" (link), if a healthcare organization or HIT seller is sluggish or resistant in taking corrective actions, consider taking another risk (perhaps this is for the very daring or those near the end of their clinical career). Present your organization's management with a statement for them to sign to the effect of:
"We, the undersigned, do hereby acknowledge the concerns of [Dr. Jones] about care quality issues at [Mount St. Elsewhere Hospital] regarding EHR difficulties that were reported, namely [event A, event B, event C ... etc.]

We hereby indemnify [Dr. Jones] for malpractice liability regarding patient care errors that occur due to EHR issues beyond his/her control, but within the control of hospital management, including but not limited to: [system downtimes, lost orders, missing or erroneous data, etc.] that are known to pose risk to patients. We assume responsibility for any such malpractice.

With regard to health IT and its potential negative effects on care, Dr. Jones has provided us with the Joint Commission Sentinel Events Alert on Health IT at http://www.jointcommission.org/assets/1/18/SEA_42.PDF, the IOM report on HIT safety at http://www.modernhealthcare.com/Assets/pdf/CH76254118.PDF, and the FDA Internal Memorandum on H-IT Safety Issues at http://www.scribd.com/huffpostfund/d/33754943-Internal-FDA-Report-on-Adverse-Events-Involving-Health-Information-Technology.

CMO __________ (date, time)
CIO ___________ (date, time)
CMIO _________ (date, time)
General Counsel ___________ (date, time)
etc."
  • If the hospital or organizational management refuses to sign such a waiver (and they likely will!), note the refusal, with date and time of refusal, and file away with your attorney. It could come in handy if EHR-related med mal does occur.
  • As EHRs remain experimental, I note that indemnifications such as the above probably belong in medical staff contracts and bylaws when EHR use is coerced.

These recommendations still stand, although after this recent story, my caution about retaliation should be re-emphasized:

The Advisory Board Company
Feb. 14, 2013
Hospital Framed Physician; Planted a Gun

-- SS

Sunday, February 28, 2010

FDA on Health IT Adverse Consequences: 44 Reported Injuries And 6 Deaths In Two Years, Probably Just 'Tip of Iceberg'

The Office of the National Coordinator for Health IT held a meeting of the HIT Policy Committee, Adoption/Certification Workgroup on February 25, 2010. The topic was "HIT safety." The agenda, presenters and presentations are available at this link.

At this meeting FDA testimony was given by Jeffrey Shuren, Director of FDA’s Center for Devices and Radiological Health. Dr. Shuren noted several categories of health IT-induced adverse consequences known by FDA. This information was striking:

He wrote:

... In the past two years, we have received 260 reports of HIT-related malfunctions with the potential for patient harm – including 44 reported injuries and 6 reported deaths. Because these reports are purely voluntary, they may represent only the tip of the iceberg in terms of the HIT-related problems that exist.
[I'd call that a likely understatement - ed.]

Even within this limited sample, several serious safety concerns have come to light. The reported adverse events have largely fallen into four major categories: (1) errors of commission, such as accessing the wrong patient’s record or overwriting one patient’s information with another’s; (2) errors of omission or transmission, such as the loss or corruption of vital patient data; (3) errors in data analysis, including medication dosing errors of several orders of magnitude; and (4) incompatibility between multi-vendor software applications and systems, which can lead to any of the above.


This is a technology almost universally touted as inherently beneficial, right up to our most senior elected leaders, who are now pushing this unproven technology under threat of penalty for non-adopters - certainly a precedent, especially in a supposedly democratic country. I have given examples on this blog about how this belief about the universal goodness of healthcare computing is itself inherently idealistic - and unrealistic.

Now, here are some very striking, discrete examples of HIT-related adverse consequences, the tip of a larger iceberg, size unknown but quite possibly very large (where's Kate Winslet when you need her?):


(1) Errors of Commission

Example 1: An error occurred in software used to view and document patient activities. When the user documented activities in the task list for one patient and used the “previous” or “next” arrows to select another patient chart, the first patient’s task list displayed for the second patient.

Example 2: A nuclear medicine study was saved in the wrong patient’s file. Investigation suggested that this was due to a software error.

Example 3: A sleep lab’s workstation software had a confusing user interface, which led to the overwriting and replacement of one patient’s data with another patient’s study.
[I covered other examples of confusing or "mission hostile" interfaces at an eight part series here - ed.]


(2) Errors of Omission or Transmission


Example 1: An EMR system was connected to a patient monitoring system to chart vital signs. The system required a hospital staff member to download the vital signs, verify them, and electronically post them in the patient’s chart. Hospital staff reported that, several times, vital signs have been downloaded, viewed, and approved, and have subsequently disappeared from the system.

Example 2: An operating room management software application frequently “locked up” during surgery, with no obvious indication that a “lock-up” was occurring. Operative data were lost and had to be re-entered manually, in some cases from the nurse’s recollection. [I experienced similar problems: a decade ago - ed.]

Example 3: An improper database configuration caused manual patient allergy data entries to be overwritten during automatic updates of patient data from the hospital information system.


(3) Errors in Data Analysis


Example 1: In one system, intravenous fluid rates of greater than 1,000 mL/hr were printed as 1 mL/hr on the label that went to the nursing / drug administration area.

Example 2: A clinical decision support software application for checking a patient’s profile for drug allergies failed to display the allergy information properly. Investigation by the vendor determined that the error was caused by a missing codeset.

Example 3: Mean pressure values displayed on a patient’s physiological monitors did not match the mean pressures computed by the EMR system after systolic and diastolic values were entered.


(4) Incompatibility between Multi-Vendor Software Applications or Systems


Example 1: An Emergency Department management software package interfaces with the hospital’s core information system and the laboratory’s laboratory information system; all three systems are from different vendors. When lab results were ordered through the ED management software package for one patient, another patient’s results were returned.

Example 2: Images produced by a CT scanner from one vendor were presented as a mirror image by another vendor’s picture archiving and communication system (PACS) web software. The PACS software vendor stipulates that something in the interface between the two products causes some images to be randomly “flipped” when displayed.



The above is from FDA (2010 internal memo here) and by their own admission under-represents the problems, perhaps massively. Most of these errors are inexcusable from an engineering and quality perspective.

Unfortunately, there actually is no comprehensive data on the true magnitude of the problems.


44 injuries and 6 deaths: tip of the iceberg?

Merely writing about this scarcity can bring opposition. For instance, a recent paper I wrote about the lack of data on unintended adverse consequences of HIT, and about remediation of that data paucity, was rejected. (The paucity itself might be considered an 'unintended consequence' of HIT, since secrecy about the pros and cons of HIT was not the intention of the medical informatics pioneers.)

One of the reasons given by an anonymous reviewer justifying rejection of the paper was itself striking:

The paper "adds little that is new or that goes beyond what a reader might find in a major city newspaper", the reviewer wrote.

Little that is new to whom, exactly? Where are the extant papers on the scarcity of such data?

I am also highly uncertain as to which "major city newspapers" the reviewer was referring to, as I've rarely if ever read articles in newspapers about the paucity of data on adverse consequences of HIT or the remediation of the paucity.

One reason for writing the paper was due to the fact the "major newspapers" - and the medical and medical informatics journals - largely avoid such issues entirely.

(The scarcity was noted by organizations such as the Joint Commission, however, in their 2009 Sentinel Events Alert on HIT - "There is a dearth of data on the incidence of adverse events directly caused by HIT overall.")

This reviewer continued with the frivolous comment that "proposing a classification of sources of unintended consequence and analysis of reasons for underreporting of each type in the resulting classification could be a useful addition to the field."

Ironically, I actually devoted an entire section of the paper to reasons for under-reporting and the scarcity of data on unintended consequences, broadly speaking, although this wasn't the paper's main purpose. Its purpose was to point out the dangers inherent in such an information scarcity. It is also hard to granularly classify variants of a phenomenon on which there is scarce data to begin with.

Rather than revise the paper, I may simply put it in the public domain and send it to my elected representatives involved in healthcare IT policy. I'd done this with another paper I'd written in 2007 on EMRs and postmarketing drug surveillance that had received a mysteriously similar "could have read this in any newspaper" critique.

In summary, the light is starting to shine on HIT dangers. It is also increasingly recognized by regulators such as FDA that "data scarcity" is a problem of major significance ("tip of the iceberg"), although there are those in this sector who would prefer to keep physicians and patients in the dark on this issue and keep such data scarce.

Finally, while Shuren presented a number of options regarding FDA involvement in HIT regulation, he wrote that "in light of the safety issues that have been reported to us, we believe that a framework of federal oversight of HIT needs to assure patient safety." This itself represents a major change in the culture of HIT.

Addendum: the Huffington Post Investigative Fund wrote about this meeting in an article entitled "Experts: Safety Oversight Needed as Patient Records Go Digital" here.
-- SS


For more on HIT challenges see "Contemporary Issues in Medical Informatics: Common Examples of Healthcare Information Technology Difficulties" - http://www.ischool.drexel.edu/faculty/ssilverstein/cases/