
Wednesday, August 04, 2010

More on Huffington Post Investigative Fund: "FDA, Obama Digital Medical Records Team at Odds over Safety Oversight"

Re: today's Huffington Post Investigative Fund article "FDA, Obama Digital Medical Records Team at Odds over Safety Oversight."

(I'd written some preliminary comments at an earlier post entitled "Huffington Post Investigative Fund: FDA, Obama Digital Medical Records Team at Odds over Safety Oversight.")

First, some relatively obvious questions about the Cerner health IT crashes at the Trinity Health System chain of hospitals featured in the story:

  • How many patients were affected? Is the number actually known?
  • Were affected patient charts corrected?
  • What restrictions, if any, have been placed on physicians, other clinicians, employees, contractors, staff, etc. about speaking to the press on the Trinity Health HIT malfunctions?
  • If any restrictions were placed, are they in violation of Joint Commission Safety Standards as in my July 22, 2009 JAMA letter to the editor "Health Care Information Technology, Hospital Responsibilities, and Joint Commission Standards" at this link?
  • Did this healthcare system sign "hold harmless" clauses with Cerner, as per Koppel and Kreda's 2009 JAMA article "Health Care Information Technology Vendors' Hold Harmless Clause - Implications for Patients and Clinicians," JAMA 2009;301(12):1276-1278, at this link?
  • Did this healthcare system sign a gag clause with Cerner, the vendor of their affected systems according to the Huffington Post article?
  • Medical adverse events from medical record errors can occur quickly or can be delayed. If patient harm comes of the IT errors that occurred, will the healthcare system file a public report?

Now, on to some specific comments and observations on the Huffington Post Investigative Fund article.

The Huffington Post article begins:

Computers at a major Midwest hospital chain went awry on June 29, posting some doctors’ orders to the wrong medical charts in a few cases and possibly putting patients in harm’s way.

The digital records system “would switch to another patient record without the user directing it to do so,” said Stephen Shivinsky, vice-president for corporate communications at Trinity Health System. Trinity operates 46 hospitals, most in Michigan, Iowa and Ohio.

The quoted phrase "would switch to another patient record without the user directing it to do so" sounds like obfuscatory PR language, as if the EMR system merely changed the window focus to another patient's window (people do this all the time when they have multiple windows open in a GUI). If so, that would not be a major problem - the user would simply click back to the desired window.

Let's state with crystal clarity what it sounds like the problem really was:

Data a clinician entered into the window of primary focus (say, Mary Jones's chart) would end up in the electronic record belonging to a different window in the background - one that was not the window of primary focus (say, Tom Smith's chart).

The clinician would be unaware this had occurred, and two errors would then result. The first is that appropriate data would be missing from Mary's record. The second is that inappropriate data would be present in Tom's. These events put both patients at risk of medical error.
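To make this failure mode concrete, here is a minimal, hypothetical sketch (in Python, with invented class and method names - it is in no way Cerner's actual code or design) of how a silently switched "active patient" context can misroute charted data:

```python
# Hypothetical, simplified illustration of the failure mode described above.
# The names and mechanism are assumptions made only to show how a silent
# "context switch" can misroute charted data; this is not Cerner's design.

class ChartingSession:
    """Models an EMR session that tracks one 'active' patient chart."""

    def __init__(self):
        self.active_patient = None   # chart the UI believes is in focus
        self.records = {}            # patient name -> list of orders

    def open_chart(self, patient):
        self.active_patient = patient
        self.records.setdefault(patient, [])

    def background_refresh(self, patient):
        # The bug: a background event silently changes the active-patient
        # context without the clinician directing it to do so.
        self.active_patient = patient
        self.records.setdefault(patient, [])

    def enter_order(self, order_text):
        # The order is committed against whatever chart is active *now*,
        # not the chart the clinician was looking at when typing began.
        self.records[self.active_patient].append(order_text)


session = ChartingSession()
session.open_chart("Mary Jones")              # clinician opens Mary's chart
session.background_refresh("Tom Smith")       # context switches silently
session.enter_order("warfarin 5 mg PO daily")

print(session.records)
# {'Mary Jones': [], 'Tom Smith': ['warfarin 5 mg PO daily']}
# Both errors described above: the order is missing from Mary's record,
# and an order never meant for Tom is present in his.
```

One defensive design choice, under the same assumptions, is to bind each order to the chart identifier captured when the entry form was opened and to verify that identifier again at commit time, refusing the write if the context has changed.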

Less than two weeks later, an unrelated glitch caused Trinity to shut down its $400 million system for four hours at 10 hospitals in the network because electronic pharmacy orders weren’t being delivered to nurses for dispensing to patients, he said.

Was this 'glitch' [a code word for "we are not in control of our system, it is in control of us" - ed.] related to the problem above? If so, that would suggest major system problems deep within its code. If not, it suggests lax overall quality control of this IT.

“As soon as it was brought to our attention, we moved to fix the problem,” Shivinsky said of Trinity’s system.

This statement says nothing. One would not expect them to wait to fix a potentially dangerous problem.

He said nobody was injured in either event, ...

As in my prior post, the correct statement would be:

Nobody was injured, yet, due to the errors.

... the Cerner Corp. system now works properly,

Does that mean all the problems are fixed? The article implies otherwise at its ending, where Shivinsky is quoted as saying:

... Meanwhile technicians are still trying to figure out the root cause. “We’ll get to the bottom of it and fix it,” he said.

By the way, I note the following employee reporting site on Cerner's environment: link. Could this be a factor in why Richard Granger, former head of the UK's NHS national IT programme, said that:

"Sometimes we put in stuff that I'm just ashamed of ... Some of the stuff that Cerner has put in recently is appalling ... Cerner and prime contractor Fujitsu had not listened to end users ... Failed marriages and co-dependency with subcontractors ... A string of problems ranging from missing appointment records, to inability to report on wait times ... Almost a dozen cancelled go-live dates ... Stupid or evil people ... Stockholm syndrome -identifying with suppliers' interests rather than your own ... A little coterie of people out there who are "alleged experts" who were dismissed for reasons of non-performance."

I can only wonder.

... and the hospital chain determined that “technician error” led to the system shutdown and that the mixing up of patients was the result of a “Cerner coding issue” involving software that occurred after an upgrade.

What was the "technician" doing that led to the shutdown? What were the technician's background and qualifications for working on a mission-critical medical device?

Further, what, exactly, was the "Cerner coding issue"? Was it Cerner's fault? Trinity's? Both?

Were other organizations using the same software similarly affected?

How will other healthcare organizations learn if the cause of the problems is not revealed?

Even absent any harm to patients, such incidents underscore possible risks faced by even large health organizations that have eagerly embraced new medical software to track patient records and treatment. As the Obama administration ramps up plans to create a digital medical file for every American by 2014 – at an anticipated tab to taxpayers of up to $27 billion – technology’s boosters tend to tout its potential benefits to patients and ability to slow runaway medical costs.

That's not been my observation. Look at the UK for instance:

The UK Public Accounts Committee report on disastrous problems in their £12.7 billion national EMR program is here.

Gateway reviews of the UK National Programme for IT from the Office of Government Commerce (OGC) are here (released under the UK’s Freedom of Information Act), and a summary of 16 key points is here.


I wish there were strong documentary evidence on the latter point about major savings, and that there wasn't evidence to the contrary. The healthcare system doesn't have enough spare capital to throw away on the hope that some touted technology (from which an industry stands to make billions) is a panacea.

Yet despite the high political and financial stakes, the administration has established no national mandatory monitoring procedure for the new devices and software. That no process exists to report and track errors, pinpoint their causes and prevent them from recurring is largely the result of two decades of resistance by the technology industry, a review of government records and interviews by the Huffington Post Investigative Fund shows. The industry argues that even with flaws, digital systems are an improvement over current paper records.

That is no excuse for decades of toleration of the flaws. It is an amoral position. As I wrote in my July 28 post "An Open Question on Moral Authority and Healthcare IT," it is playing God.

The HuffPo article continues:

“There’s an assumption that just because you have an electronic system, it’s going to be safer, so people let down their guards,” said Vimla Patel, who directs research on the topic at the University of Texas Health Science Center in Houston.

It's not an "assumption." This meme has been pushed so long by the HIT industry and its irrationally exuberant, uncritical supporters that it has become an accepted statement of fact - as per my July 14 post "Science or Politics? The New England Journal and The 'Meaningful Use' Regulation for Electronic Health Records," where the head of ONC states it as fact in the New England Journal of Medicine, with nary a footnote to back it up:

Blumenthal - The widespread use of electronic health records (EHRs) in the United States is inevitable. EHRs will improve caregivers’ decisions and patients’ outcomes. Once patients experience the benefits of this technology, they will demand nothing less from their providers. Hundreds of thousands of physicians have already seen these benefits in their clinical practice.

That's pretty definitive. If there's a trace of doubt, I don't see it.

Monitoring could help others learn from problems faced by early users of the technology, which is being sold nationally, or how they were remedied. Shivinsky said he wasn’t sure if federal officials had been notified of the difficulties at Trinity — or would be. No rule requires it.

Why is there no rule? What if this were a medical device such as a CT scanner, or heart defibrillator? Why does health IT get special accommodation when it is now a regulator and governor of clinician communications and actions, placed in between clinician and patient?

Almost a month after the first event at Trinity, David Blumenthal, the government’s top medical health information technology official, didn’t know about it. “First I’ve heard about it,” Blumenthal said when told by a reporter July 20, as he left a Capitol Hill hearing. Since then, Blumenthal has declined to discuss the incident or its implications.

That's perhaps because it conflicts with his other assertions on health IT technological determinism, noted above. How many other "events" is he aware of?

Kelli Christman, a spokesman for Cerner, the manufacturer of the software used at Trinity, did not respond to repeated emails and phone calls over the past week seeking comment.

What are they hiding? Perhaps other affected healthcare systems?

... Many industry groups contend that FDA regulation would “stifle innovation” and stall the national drive to wire up American medicine. That view resonates among the dozens of health information technology experts serving as consultants to Blumenthal’s office and on advisory groups. Blumenthal also has been skeptical of the need for regulation and argued that even if some miscues occur, digital systems are far less prone to error than paper ones.

In an industry with contractual gag clauses and clinician fear of speaking about HIT problems (e.g., of retaliation by hospital officials), how can anyone be sure of this? Is this a scientific statement, or a marketing position?

See my paper "Remediating an Unintended Consequence of Healthcare IT: A Dearth of Data on Unintended Consequences of Healthcare IT" for more on this issue.

“We know that every study and every professional consensus process has concluded that electronic health systems strongly and materially improve patient safety. And we believe that in spreading electronic health records we are going to avoid many types of errors that currently plague the healthcare system,” Blumenthal said when unveiling new regulations in Washington on July 13.

This statement is without merit, as per my prior post "Huffington Post Investigative Fund: FDA, Obama Digital Medical Records Team at Odds over Safety Oversight."


Further, a note to Dr. Blumenthal: consensus is not science:

[Crichton, Caltech Michelin Lecture, 2003] ... I want to pause here and talk about this notion of consensus, and the rise of what has been called consensus science. I regard consensus science as an extremely pernicious development that ought to be stopped cold in its tracks. Historically, the claim of consensus has been the first refuge of scoundrels; it is a way to avoid debate by claiming that the matter is already settled. Whenever you hear the consensus of scientists agrees on something or other, reach for your wallet, because you're being had.

Let's be clear: the work of science has nothing whatever to do with consensus. Consensus is the business of politics. Science, on the contrary, requires only one investigator who happens to be right, which means that he or she has results that are verifiable by reference to the real world.

In science consensus is irrelevant. What is relevant is reproducible results. The greatest scientists in history are great precisely because they broke with the consensus. There is no such thing as consensus science. If it's consensus, it isn't science. If it's science, it isn't consensus. Period.

The article goes on regarding Blumenthal:

In public remarks that day, Blumenthal said he “expects” an eventual certification process for the digital systems to “collect information about the problems that occur with the implementation of electronic health records, if any.” He did not say when that would happen.

Eventual? Why not NOW, considering that the HIT industry has been in business for decades? The infrastructure for such reporting already exists, such as the FDA MedWatch system and Manufacturer and User Facility Device Experience (MAUDE) database - where, by the way, an HIT-related death was reported (link).
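As a concrete illustration that such reports are already machine-searchable, here is a minimal sketch that pulls device adverse-event reports. It rests on an assumption about tooling: it uses the present-day openFDA endpoint that exposes MAUDE data, and the search phrase is only an illustrative guess, not a validated way to isolate health-IT-related events:

```python
# Minimal sketch, assuming the openFDA device adverse-event endpoint, which
# serves MAUDE report data. The search phrase is an illustrative guess only.
import json
import urllib.parse
import urllib.request

ENDPOINT = "https://api.fda.gov/device/event.json"
search = 'device.generic_name:"electronic health record"'
url = f"{ENDPOINT}?search={urllib.parse.quote(search)}&limit=5"

with urllib.request.urlopen(url) as response:
    reports = json.load(response).get("results", [])

for report in reports:
    # Print when each report was received and the reported event type.
    print(report.get("date_received"), report.get("event_type"))
```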

Imagine the aviation, nuclear energy, or pharmaceutical industries making such outrageous statements about getting around to collecting information on potentially hazardous flaws "eventually."

In a later interview, Blumenthal said “safety concerns are not being ignored,” but wouldn’t comment further.

Yes, they are being ignored. Worse, they're being suppressed as in this case example, "A Lawsuit Over Healthcare IT Whistleblowing and Wrongful Discharge: Malin v. Siemens Healthcare."

... Dozens of other health information technology insiders, from academics to front-line users who believe digital medical records can promote better and cheaper health care, told the Investigative Fund in interviews that they nonetheless fear safety issues will mount as doctors and hospitals move quickly to install the systems and collect stimulus checks.

Just anecdotal, according to ONC and others.

“People just assume that computers will make things safer,” said Nancy Leveson, a safety engineering expert at Massachusetts Institute of Technology. “While they can be designed to eliminate certain kinds of hazards, they increase others and sometimes they introduce new types of hazards.”

This is a principle drawn directly from the discipline of Social Informatics. Social Informatics (SI) refers to the body of research and study (e.g., as collected here) that examines social aspects of computerization, including the roles of information technology in social and organizational change, the uses of information technologies in social contexts, and the ways that the social organization of information technologies is influenced by social forces and social practices.

Some experts are calling for closer government monitoring of the systems to protect the public. “We need to have some scrutiny at the front end and have an approval process to make sure they are safe before they’re deployed,” said Sharona Hoffman, a law professor at Case Western Reserve University, who has written about the issue in academic journals.

I am one of those experts, and I agree.

... In 2004, digital record keeping got a boost when President George Bush signed an executive order to create a digital medical file for every American within a decade, a goal officials said at the time they could reach “without substantial regulation.”

“The time wasn’t right at that time to move forward or the support wasn’t there (for safety regulations),” said Robert Kolodner, who ran the national coordinator’s office during some of the Bush years.

Again, I ask - why the hell not? When is a good time to discuss healthcare and HIT safety? Would such attitudes be tolerated in other industries? I dare say, hell no.

Edward H. Shortliffe, president of the American Medical Informatics Association and a longtime industry figure, agreed that safety issues weren’t a “primary concern” as tech companies began to expand their offerings.

That is a startling revelation - it could be the basis for medical malpractice lawsuits alleging negligence, up to and including criminally negligent homicide if a patient dies of an HIT-related event.

Criminal negligence: The failure to use reasonable care to avoid consequences that threaten or harm the safety of the public and that are the foreseeable outcome of acting in a particular manner.

The HuffPo article further observes:

Earlier this year, the [health IT] trade group convened an expert panel to study the issues for the first time, but its findings have yet to be made public. Shortliffe said he didn’t think the organization would take a stand on government regulation of the industry, but said: “We recognize that there are significant challenges that the field as a whole is facing.”

The "first time" in an industry messing with people's medical care for decades? How is this possible?

... ONC director Blumenthal, the point man for the administration, has called the FDA’s injury findings “anecdotal and fragmentary.”

Just how many "anecdotes" do we need in an industry that places gag clauses on health IT users?

[March 2011 addendum - see thoughts on health IT "anecdotes" at this posting: Those Who Dismiss Healthcare (and Healthcare IT) Adverse Events Reports as Mere "Anecdotes" Have Lost, Supreme Court-Style - ed.]

He told the Investigative Fund that he believed nothing in the report indicated a need for regulation [i.e., absence of evidence in a tight-lipped industry taken as evidence of absence - ed.]. Yet others see anecdotes as a starting point for a more methodical look at problems that arise.

Who is more attuned to risk mitigation when they see red flags? Who are the clinician scientists, and who are the "see no evil, hear no evil, speak no evil" politicians?

The same day that Blumenthal, Sebelius and other federal health officials unveiled their digital records plan in Washington, an obscure government agency held a conference less than 20 miles away in suburban Maryland to discuss the state of quality controls.

Ben-Tzion Karsh, an engineering professor at the University of Wisconsin in Madison who attended the National Institute of Standards and Technology conference, said he heard a “broad consensus” among experts that electronic medical records need to function better and safer. “The truth is that we do not at this time know what would make an EHR (electronic health record) safe,” he said.

Others said that despite the rosy view taken by many political figures in Washington, many systems on the market today aren’t designed in ways that prevent and limit new errors—and that nobody is holding the industry accountable.


Again I ask: how is this possible after decades of this industry's selling of products to hospitals for use on actual patients?

Also, the simple question arises: is this technology truly ready for the ambitious national rollout plans of the past two administrations?

I've touched on many of these issues in past years on this blog and at my academic website on HIT problems.

I see a clear runaway train and train wreck headed down the tracks.

The next few years in health IT should prove interesting indeed.

-- SS

2 comments:

  1. Did Trinity shut down the system with or without warning anybody to scurry for the paper, or did the system crash unexpectedly? There is a difference. The double talk by the hospital company is evident.

  2. It hurts me to see the suffering that has been caused by ill conceived alterations of medical care promoted by government under the influence of profiteers.
