Monday, September 06, 2010

Health IT: "Danger"

In an article simply entitled "Danger", Health Data Management author Elizabeth Gardner spells out some "inconvenient truths" about health IT.

Many of the contributors have opined on the risks associated with health IT; several are newly contrite about this issue:

Danger
By Elizabeth Gardner
Health Data Management Magazine, 08/01/2010

Even to Ross Koppel, electronic health records are better than paper ones, "or cuneiform tablets, smoke signals, or carrier pigeons," he adds. He prefers to use hospitals and doctors that have EHRs.

But the University of Pennsylvania sociologist specializes in analyzing interactions between medical computer systems and the people who use them, and he's found enough problems to turn him into an industry gadfly on the potential dangers of EHRs.

"A resident will get an alert at 50 [milligrams of a certain drug] at one hospital, 60 at a second hospital, and no alert at a third hospital because they turned it off," Koppel says. "So he thinks the 70 milligrams he's ordered there are safe. The residents don't know whether the alerts are on or off. They're not familiar with many medications and they start a new rotation every thirty days. They use these alerts as safety bumpers and that's not safe."
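The inconsistency Koppel describes can be shown in a minimal sketch; the site names and thresholds below are entirely hypothetical, not drawn from any actual vendor's system:

```python
# Hypothetical illustration of site-by-site alert configuration.
SITE_THRESHOLDS_MG = {
    "hospital_a": 50,    # alerts at 50 mg
    "hospital_b": 60,    # alerts at 60 mg
    "hospital_c": None,  # alerting turned off entirely
}

def dose_alert(site: str, ordered_mg: float) -> bool:
    """Return True if this site's configuration would fire an alert."""
    threshold = SITE_THRESHOLDS_MG[site]
    return threshold is not None and ordered_mg >= threshold

# The same 70 mg order is flagged at two sites and silently
# accepted at the third -- and the resident has no way to know which:
for site in SITE_THRESHOLDS_MG:
    print(site, dose_alert(site, 70))
```

The hazard is not any one threshold but the variance itself: a clinician trained to treat silence as safety is only as safe as the local configuration.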

... Koppel has found plenty of other glitches, from outright programming errors to user interfaces that make life difficult for clinicians. Numerical values appear in an order that makes sense to the computer but looks random to a human; positive test results aren't always flagged for review; weights aren't consistently labeled as pounds or kilograms (which can lead to babies, for example, being given twice, or half, the medication they need).
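The pounds-versus-kilograms hazard is simple arithmetic; a minimal sketch (the dosing function and the numbers are hypothetical) shows how an unlabeled weight roughly doubles or halves a weight-based dose:

```python
# Hypothetical illustration: a weight-based dose computed from a
# number whose unit (lb vs. kg) is ambiguous is wrong by a factor
# of roughly 2.2 in either direction.
LB_PER_KG = 2.20462

def dose_mg(weight_kg: float, mg_per_kg: float) -> float:
    """Weight-based dose, assuming the weight really is in kilograms."""
    return weight_kg * mg_per_kg

# A 3.5 kg newborn and a drug dosed at 10 mg/kg:
correct = dose_mg(3.5, 10)                 # 35 mg
# The same baby's weight charted in pounds but read as kilograms
# more than doubles the dose ...
overdose = dose_mg(3.5 * LB_PER_KG, 10)    # ~77 mg
# ... and the reverse error (kg read as lb) roughly halves it:
underdose = dose_mg(3.5 / LB_PER_KG, 10)   # ~16 mg
print(correct, overdose, underdose)
```

For a neonate, a factor-of-two error in either direction can be the difference between a therapeutic dose and a toxic or useless one.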

"Everyone focuses on why physicians are resistant to computers, but I would rather focus on how difficult the systems are to use," Koppel says. "Most physicians are the smartest guys in the room. Their resistance to technology as such is zero, but they resist software that has a clunky structure."

Dr. Koppel, a sociologist, is a well-recognized name in health IT critical-thinking circles. He will win no favor from the health IT vendors and CIOs for that comment about physicians and smartness, but he is quite correct. Physicians are not Luddites. They readily adopt new technology proven good for patients.

In fact, in my observation it is IT personnel who are the true Luddites: clinging to inappropriate, rigid business-IT views of the healthcare IT development and implementation process (versus more appropriate and modern agile methodologies), holding unshakable, stereotypical views of physicians, and remaining unreasonably obstinate in the face of clinician complaints about "clunky" health IT user experiences.

Every year the ECRI Institute, Plymouth Meeting, Pa., a not-for-profit organization that evaluates health technology, issues a top 10 list of technology hazards in medical care. "Problems with computerized equipment and systems" ranked seventh this year, right behind "needlesticks and other sharps injuries" and ahead of "surgical stapler hazards." Most of the incidents reported to ECRI by its 5,000 members (hospitals, health systems, payers, and other interested parties) were due to convergence of computers and medical devices in areas like medication management and the routing of device alarms to clinicians' cell phones and pagers. (ECRI Institute points out that such problems are "most certainly underreported.")

Under-reporting of health IT hazards is a familiar theme to this author, as in a 2009 paper entitled "Remediating an Unintended Consequence of Healthcare IT: A Dearth of Data on Unintended Consequences of Healthcare IT" that went unpublished after first-round reviews. Some of those reviews were legitimate and constructive regarding revision, but others appeared to suggest the topic was verboten (for example, "nothing is in this paper that could not be read in any big city newspaper" - rather ironic, considering the paper's topic). I did not bother with a revision, simply making the paper public and bypassing the peer-review censorship I saw coming.

But EHRs can easily cause errors, too. Plenty of experts believe that too many systems are being installed too fast into environments too complex to be easily computerized. In the frenzy to be eligible for federal EHR meaningful use incentive payments, and avoid reimbursement penalties starting in 2015, institutions may be setting themselves up for disastrous computer-induced medical errors.

The theme of "too fast" has been present in my writings for a while now; see, for example, my 2008 posts "Should The U.S. Call A Moratorium On Ambitious National Electronic Health Records Plans?" and "Open Letter to President Barack Obama on Healthcare Information Technology".

"I'm one of the biggest believers [in EHRs], but there's tremendous pressure to implement these systems so fast," says medical informaticist Dean Sittig, associate professor at the University of Texas Health Science Center at Houston and a leading researcher on successes and failures of EHR implementations. "It worries me that people won't have adequate time to come to grips with what they're doing and test their systems properly."

Dr. Sittig did write some excellent, pioneering articles on the indispensability of the CMIO role ("information architect" - PDF) back in the mid-1990s that I use in my teaching. However, a few years ago he told a former student of mine (he was unaware of that relationship) who listed me as a reference on her CV not to do so, as I was not a "real medical informaticist," or words to that effect. It seems at the time he may have taken issue with my realist views on the problems with health IT.

The medical environment is more complex than other fields like aircraft navigation, which is already hard enough to computerize, notes Nancy Leveson, professor of aeronautics and astronautics at the Massachusetts Institute of Technology. She's a pioneer in software safety ... "We're talking about a professional environment of doctors, and changing the way they do business," Leveson says. "Most other kinds of automation aren't doing that.

This issue is critical and at the root of health IT dysfunction. No other profession is being asked to use IT in the manner in which clinicians have been asked. No other profession may have an information model as complex that they're asked to record in painstaking, granular detail, either.

Because software engineers aren't taught about usability and the impact of their systems on the world, they think they'll just automate it the way they want and make people do it their way. There's a lot of stuff out there [in health care] that's very difficult to use. The industry is naive about introducing software and the change it requires and the potential hazards it introduces, and they think it's going to be all right."

I would replace the word "naive" with "willfully ignorant, complacent and negligent." There is simply no acceptable excuse for a field that has existed for decades to remain as toddlers regarding the impact of its work.

... Clinicians who already have extensive experience with EHRs are under no illusions that everything works smoothly. "It's inevitable that some new errors will be introduced [with EHRs]," says David Bates, M.D., chief of general medicine at Brigham and Women's Hospital, Boston, a patient safety expert, and a member of the information technology executive committee of Partners HealthCare, Brigham's parent.

Considering the ice-cold reception by most of those clinician experts (e.g., in the medical informatics community) to my writings on HIT problems that began in 1999 and now reside at my Drexel site here, and to the similar writings of others, I disagree with Bates' assessment. I think the experts have now painted themselves into a corner and have been forced by reality and their own willful blindness into admitting the truth about this experimental technology. (A PubMed search on "Bates DW" is not generally revealing of papers on health IT risks until relatively recently.)

"The key thing is to devote enough resources and attention to fixing them [errors] after they happen. Your EHR may prevent 10 errors for every new one it causes [not sure where these figures come from or if they're generalizable at all - ed.], but you have to have an approach for dealing with the new ones."

Again, I would differ. The key thing is to prevent errors from happening as much as possible. The pharma and medical device industries do this through RCTs, post-market studies, and strong regulatory requirements for their products' manufacture and use.

Bates is not an advocate of regulation of health IT. As in my post "JAMA letter: Health Care Information Technology, Hospital Responsibilities, and Joint Commission Standards", in 2009 Bates signed on to an unpublished letter to JAMA (archived here) in response to Koppel and Kreda's article on "hold harmless" clauses that stated "... the belief that the best approach to increase the safety and effectiveness of EHR systems is by legal regulation of system vendors is misplaced." (Incidentally, the letter that JAMA did publish was mine.)

... Partners has been developing its own in-house EHR for several decades, and still encounters things that need fixing. For example, when physician order entry was introduced, the system made it possible to order fatally large doses of intravenous potassium, because the initial order choices hadn't been properly vetted by the internal team responsible. The nurses who executed the orders knew enough to catch the error before it harmed a patient, but Bates says it took a year and a half to get the screen corrected in the order entry system.

I can add that taking a year and a half to correct a potential fatality-causing screen in a home-built clinical IT application is not just a technological issue, but also an organizational, political and leadership issue.

... "Emergency care is by definition nonlinear and unpredictable, and information technology tends to enforce a certain amount of linearity: you can't do step 5 until you've done steps 1 through 4," says Wears. Emergency room personnel often care for multiple patients simultaneously and have to do things "out of order" so they can be more efficient. If an EHR enforces a linear pattern, it will just get in their way, and they'll compensate by running a parallel system on paper and putting information into the EHR later. "That subverts one of the fundamental things you wanted the system to do-give you real-time feedback on good and bad ideas," Wears says.

The realities of the ED cannot be neatly dealt with as if it were a calm, solitary office environment.
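Wears' point about enforced linearity can be sketched in miniature; the workflow steps below are hypothetical, not taken from any actual EHR:

```python
# Hypothetical illustration of "enforced linearity": a workflow
# object that refuses any step taken out of order. Step names are
# illustrative only.
class LinearWorkflow:
    STEPS = ["register", "triage", "assess", "order", "treat"]

    def __init__(self):
        self.completed = []

    def do(self, step: str) -> bool:
        """Allow a step only if every earlier step is already done."""
        idx = self.STEPS.index(step)
        if self.completed == self.STEPS[:idx]:
            self.completed.append(step)
            return True
        return False  # out-of-order work is simply refused

wf = LinearWorkflow()
print(wf.do("treat"))     # refused: steps 1-4 not yet done, so the
                          # clinician falls back to paper and back-enters
                          # everything later, losing real-time feedback
print(wf.do("register"))  # accepted: this is "step 1"
```

A system built this way guarantees exactly the paper-workaround-and-batch-entry behavior Wears describes, defeating its own purpose.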

There's plenty of blame to go around. Koppel and Leveson say software design, or lack of it, is a common culprit, and they take vendors to task for not focusing on usability.

Again, there is no reasonable excuse for IT intractability in a mission critical sector.

Leveson says part of the problem is a lack of regulatory standards. "The FDA doesn't want to oversee anything and that's a mistake," she says. "So it's become this free-for-all in the industry."

I've used that same language. In my 2009 post E-Health Hazards: Provider Liability and Electronic Health Record Systems I wrote "the unregulated free-for-all that has been the health IT marketplace, with dangerous and even outrageous practices I noted starting a decade ago, must come to an end as the market matures and as diffusion of this technology massively increases per the government mandates now in effect."

Though experts say EHRs clearly fall under the category of medical devices, the FDA has steered clear of directly regulating them. Its ill-starred policy of requiring pre-market approval for blood-bank software, begun back in the 1990s, resulted in major vendors pulling out of the market altogether and stifling innovation ... "That kind of regulation provides a huge economic disincentive and there hasn't been any substantial improvement in blood bank software in 10 years," says Geisinger's Walker. "If the FDA required it for EHRs, it would harm patients more than help."

I disagree with Walker and especially disagree with the latter statement, based on one anecdotal case of IT regulation (ironically, it's HIT proponents who most often claim that 'anecdotes' of patient harm don't make data).

Further, the FDA was called in initially because of IT dangers in existing blood banking software. Also, the "innovation" referred to could more accurately be referred to as "lifecycle adaptation and enhancement", not true innovation. In other words, there was not enough profit to be made in maintaining mature software under regulation. The effect might be to push the quick-buck profiteers out of the industry, and thus improve quality and innovation.

The Swedish Medical Products Agency is leading the way in the EU toward classifying HIT as a medical device to be regulated, as in my post "Improving Patient Safety In The EU: HIT Should Be Classified As Medical Devices". Yet that does not seem to have impeded Swedish innovation; in the UK, Wrightington, Wigan and Leigh NHS Foundation Trust has recently awarded its hospital information system contract to Swedish healthcare systems provider Cambio (link).

Albeit another anecdote: FDA regulation did not harm pharma IT as far as I could tell during my time as a Group Director in Merck Research Labs' Research Information Systems Division.

Truth is, there is no good data one way or the other regarding regulation, but after thirty or more years of an unfettered HIT industry, I believe it reasonable to say that the presence of harmful HIT speaks more for regulation than for a continued industry "free-for-all."

The Agency for Healthcare Research and Quality is currently working with the FDA, VA and other federal agencies to develop a common format for reporting I.T.-related patient safety events and unsafe conditions.

"What took so long" is my question.

Read the whole article. The myth of health IT beneficence continues to be eroded. One can only hope the tens of billions earmarked for the technology in the recent economic "recovery" legislation will become similarly eroded in years to come, to allow the technology to be safely improved - that is, in vitro, not in vivo.

-- SS

8 comments:

Anonymous said...

Dr. Bates, unabashed HIT zealot, has had an epiphany: "...the system made it possible to order fatally large doses of intravenous potassium, because the initial order choices hadn't been properly vetted by the internal team responsible. The nurses who executed the orders knew enough to catch the error before it harmed a patient..."

Dr. Bates, are you sure this defective CPOE never took a patient down? How then would you have known about it? And what if an agency nurse had not known enough to catch the error?

InformaticsMD said...

Anonymous, these are good questions.

-- SS

Live it or live with it said...

Newsflash: In big company HIT, the IT person is normally not the smartest person in the room. The best programmers and analysts generally aren't interacting with the end users because they are too busy and valuable to the company in writing code.

In fact the best often leave the HIT company and strike out on their own.

InformaticsMD said...

Live it or live with it said...

The best programmers and analysts generally aren't interacting with the end users because they are too busy and valuable to the company in writing code.

I agree.

The best architects are much more important than carpenters and plumbers when designing a new edifice, yet in some industries it seems management doesn't understand this.

I would really like health IT to have a chance of helping the medical field, yet the industry in its present form seems headed in the wrong direction.

Anonymous said...

As a former staff member at Geisinger, Walker's home turf, I found that the admin was aggressive at retaliation to protect its investment and reputation in HIT. They went after people with abandon for supposed HIPAA violations if they crossed the admin. Walker, to where did you report adverse events caused by HIT?

InformaticsMD said...

Anonymous said...

As a former staff at Geisinger, Walker's home turf, I found that the admin was aggressive at retaliation to protect its investment and reputation in HIT

I don't know if this is true or not, but it certainly is believable from my own experiences since entering this field 20 years ago. The territoriality of IT is immense; HIT combines the territoriality of IT with the territoriality of medicine for a combined effect I find atrocious ("blood for computers" is an apt description).

I hope what anonymous wrote is not true. As the "protected" status of health IT ends (and it will end), I believe charges beyond med mal, including criminal negligence in cases of patient injury and death due to EMR-caused errors, will appear.

These charges will likely not be against the clinicians alone, but will also include the executives responsible for these systems.

HIPAA-Run said...

HIPAA is most certainly used as a silencing tool; there is no part of the HIPAA code that protects against this, so why wouldn't those at the hospital use it to silence critics?

They wouldn't use HIPAA to silence critics because these administrators and lawyers are just good people who would never, ever do such a thing, and there is no reason to suspect it or even look into it (of course).

Conversely, why would the poor whistleblower violate HIPAA just to punish the employer?

Why attribute bad intentions to the whistleblower and not to the people who have something to gain and a natural conflict of interest?

Anonymous said...

I have watched nurses, PAs, MAs, and MDs enter information into Centricity, Epic, and Star. I am truly appalled at how poorly the interfaces are designed. I am not a programmer, but it's painfully obvious that these interfaces were designed with little to no clinician input.

No wonder healthcare professionals don't want to use these systems. The design philosophy seems much like SAP's: "you will change your business processes to fit the way our system works."

I've asked various clinical staff, while watching them struggle with these horrible systems, why various functions aren't used; the answers include lack of training, difficulty of use, the time it takes, etc. It makes me wish I were a programmer and/or usability tester. IT people like me don't use these systems, so it's incumbent on the manufacturers to have users actively involved during the design, build, testing, implementation, and upgrade phases. This is a fundamental philosophy which vendors may have discarded in favor of profit.

Medical professionals should not be getting the Microsoft/Apple treatment: "You find the bugs, maybe we'll fix them" - especially when lives and health are at stake.

IT staff at medical facilities may be responsible for installing the hardware and software, support, and similar duties. Neither they, nor medical facility accounting and administrative personnel should be a major part of the selection process. The medical professionals who will have to live with the software should be the largest group represented in these decisions.

I had a six-month engagement at a community hospital some years ago. The various departments would go buy various systems and advise the IS department after the fact. Never mind if the systems were difficult to interface with the hospital billing system, that's what they wanted, and that's what they were getting. IT staff can be territorial, as can other departments' staffs. It would be better if all realized the common goal was good patient care, and worked together accordingly.