Showing posts with label Cybernetik Über Alles. Show all posts

Saturday, January 05, 2013

ONC and "Health IT Patient Safety Action & Surveillance Plan": When Sociologists Uphold the Hippocratic Oath While Physicians Pay Respect to the Lords of Kobol, We Are in a Dark Place, Ethically

[Note: this essay contains many hyperlinks. They can be right-clicked and opened in a separate tab or window.]

I've been meaning to write more on the just-before-Christmas, Friday afternoon, minimal-visibility release of the ONC report I'd written about in my Dec. 23, 2012 post "ONC's Christmas Confessional on Health IT Safety: HIT Patient Safety Action & Surveillance Plan for Public Comment."   (The ONC report itself is available at this link in PDF.)

The Boston Globe and Globe staff writer Chelsea Conaboy, however, have beaten me to the punch in the Jan. 3, 2013 article "Federal government releases patient safety plan for electronic health records", link below.

('Lords of Kobol', of course, is a pun.  They were fictional gods in a sci-fi series from the 1970's and a remake a few years ago, but in my circles the term is used satirically and derisively to reflect people expressing inappropriate overconfidence in - and perhaps worship of - computers.   Cobol, the COmmon Business-Oriented Language, is one of the oldest programming languages and was the major programming language of the merchant computing sector, including business, finance, and administrative systems for companies and governments.)

First, I do want to reiterate what I'd mentioned in my earlier post:  the new ONC report is a sign of progress, in terms of a government body explicitly recognizing the social responsibilities incurred by conducting the mass human subjects experiment of national health IT.  However, I also wrote:

... [The ONC report] is still a bit weak in acknowledging the likely magnitude of under-reporting of medical errors, including HIT-related, in the available data, and the issue of risk vs. 'confirmed body counts' as I wrote at my recent post "A Significant Additional Observation on the PA Patient Safety Authority Report -- Risk".

The Globe quoted a number of people involved in the health IT debate, and I am now commenting on their Jan. 3 article:

Federal government releases patient safety plan for electronic health records
Boston Globe
01/03/2013 11:16 AM   

By Chelsea Conaboy, Globe Staff

The federal office in charge of a massive rollout of electronic health records has issued a plan aimed at making those systems safer by encouraging providers to report problems to patient safety organizations.

Though some in the field say it doesn’t go far enough, others said the plan is an important step for an office whose primary role has been cheerleader for a technology that has the potential to dramatically improve health care in the United States but that may come with significant risks.

A major issue at the heart of the controversy is the fact that, admittedly, nobody knows the magnitude of the risks - in large part due to systematic impediments to knowing.  This has been admitted by organizations including the Joint Commission (link), U.S. FDA (link; albeit in an "internal memo" never intended for public view, and discovered only through the hard work of Center for Public Integrity investigative reporter Fred Schulte when he was at the Huffington Post Investigative Fund), Institute of Medicine of the U.S. National Academies (link, quoted at midsection of post), and others. 

I have made the claim that when you don't know the level of harm of an intervention in healthcare, and there are risk management-relevant case reports of dangers, you don't go gung-ho and start a national-scale implementation with penalties for non-adopters, and then decide to study safety, quality, usability etc.  You determine safety first in more controllable and constrained environments.  Anything else is, as I wrote, putting the cart before the horse (link).


Things are a bit out of order here.


You also certainly don't dismiss risk management-relevant case reports from credible observers as "anecdotal", the common refrain of hyperenthusiasts and (incompetent) scientists who conflate scientific research with risk management - as a researcher from Down Under eloquently observed in the Aug. 2011 guest post "From a Senior Clinician Down Under: Anecdotes and Medicine, We are Actually Talking About Two Different Things."

Back to the Globe:

A year ago, the Institute of Medicine issued a report urging the federal government to do more to ensure the safety of electronic health records. It highlighted instances in which the systems were linked to patient injury, deaths, or other unsafe conditions.

The report suggested creating an independent body to investigate problems with electronic records and to recommend fixes, similar to how the National Transportation Safety Board investigates aviation accidents.

Instead, the Office of the National Coordinator for Health Information Technology delegated various monitoring and data collection duties to existing federal offices, including the Agency for Healthcare Research and Quality [AHRQ].

The problem is that AHRQ is a research agency (as its name suggests), has no regulatory authority nor any experience in regulation, and most clinicians have never heard of it.  In effect, this ONC recommendation is lacking teeth, even compared to the relatively milquetoast recommendations of IOM itself (as I wrote about in a Nov. 2011 post "IOM Report - 'Health IT and Patient Safety: Building Safer Systems for Better Care' - Nix the FDA; Create a New Toothless Agency").


The [ONC] office has asked patient safety organizations, which work with doctors and hospitals to monitor and analyze medical errors, to add health IT to their agendas. Data from the organizations would be aggregated by the agency, but reporting by doctors and hospitals is completely voluntary.  [A prime example of what I term an extraordinary regulatory accommodation afforded the health IT industry - ed.]

Now we're into septic-shock-blood-pressure levels of weakness. Here is PA Patient Safety Authority Board Member Cliff Rieders, Esq., on the failures of even mandatory reporting, let alone voluntary reporting. From "Hospitals Are Not Reporting Errors as Required by Law", Philadelphia Inquirer, pg. 4, http://articles.philly.com/2008-09-12/news/24991423_1_report-medical-mistakes-new-jersey-hospital-association-medication-safety:
  


... Hospitals don’t report serious events if patients have been warned of the possibility of them in consent forms, said Clifford Rieders, a trial lawyer and member of the Patient Safety Authority’s board.

He said he thought one reason many hospitals don’t want to report serious events is that the law also requires that patients be informed in writing within a week of such problems. So, if a hospital doesn’t report a problem, it doesn’t have to send the patient that letter. [Thus reducing risk of litigation, and, incidentally, potentially infringing on patients' rights to legal recourse - ed.]


Rieders says the agency has allowed hospitals to determine for themselves what constitutes a serious event and the agency has failed to come up with a solid definition in six years.

Fixing this “is not a priority,” he added.

To expect hospitals to voluntarily report even a relevant fraction of mistakes and near-misses out of pure altruism, or permit their clinicians to do so, with the inherent risks to organizational interests such reporting entails, is risible.

The near-total absence of reporting by most health IT sellers and hospitals in the already-existing FDA Manufacturer and User Facility Device Experience (MAUDE) database is substantial confirmation of that; the fraction of reports that does appear in MAUDE, however, is hair-raising.  See my Jan. 2011 post "MAUDE and HIT Risks: What in God's Name is Going on Here?" for more on that issue.

Here's an example of what happens to 'whistleblowers', even those responsible for system development and safety: "A Lawsuit Over Healthcare IT Whistleblowing and Wrongful Discharge."

ONC's recommendations thus in my opinion reflect bureaucratic window dressing, designed to create progress - but progress that can probably be measured in microns.

“There was no evidence that a mandatory program was necessary,” Jodi Daniel, the [ONC] office’s director of policy and planning, said in an interview.

Really?  See the aforementioned Philadelphia Inquirer article "Hospitals Are Not Reporting Errors as Required by Law", as well as numerous articles on pharma and medical device industry reporting deficits, such as those starting at page 5 of my paper "A Medical Informatics Grand Challenge: the EMR and Post-Marketing Drug Surveillance" at this link in PDF.

There is no evidence mandatory reporting is necessary ... to someone who's either naïve, incompetent - or persuaded, e.g. with money, to not find evidence or rationale.

The [ONC] office has been under pressure to roll out the electronic health records systems quickly while protecting patient data and making sure that the systems don’t cause problems in medical care, said Dr. John Halamka, chief information officer at Beth Israel Deaconess Medical Center. 

Under pressure by the health IT lobby, perhaps; but nobody else that I can think of.

“It’s this challenging chicken-and-egg problem,” he said.

No, actually, it isn't.  Patient safety must come first. This becomes clear when one considers the late 5th century BC ethical principle Primum non nocere ("first, do no harm" or "abstain from doing harm") versus the late 20th and early 21st century IT-hyperenthusiast credo I've expressed as "Cybernetik Über Alles"  ("Computers above all"). Under CÜA, the computer has more rights than the patients, and the IT industry receives extraordinary regulatory accommodation to sloppy practices that no other healthcare or mission-critical non-healthcare sector enjoys.

I sent Dr. Halamka a set of arguments such as I make here, and a picture of a health IT 'chicken', my deceased mother in her death robes.

I received back a "thank you for the views" message - but no condolences.  (It occurs to me that I have rarely if ever received condolences from any senior HIT-hyperenthusiast Medical Informatics academic or government official to whom I've mentioned my mother.  Not to play amateur psychologist, but I believe it reflects the level of disdain or even hatred felt by these people towards health IT iconoclasts/patients' rights advocates.)

The plan, which is subject to public comment through Feb. 4, “is a reasonable start,” in part because it puts more pressure on hospitals and doctors to monitor safety, Halamka said.

As I expressed to Dr. Halamka, we are in agreement on that point.

The government would have risked stifling innovation in the industry if it had opted instead to require the kinds of tests and review by the Food and Drug Administration that new medical devices and drugs must go through, he said.

To that, I mention here (as I did in my email to him) my response to this industry meme, as I had expressed it at Q&A after my August 2012 keynote address to the Health Informatics Society of Australia:

... I had a question from the audience [after my talk], from fellow blogger Matthew Holt of the Health Care Blog.  (I've had some online debate with him before, such as in the comment thread at my April 2012 post here.)

Matthew asked me a somewhat hostile question (perhaps in retaliation for the thrashing he received at the end of my May 2009 post on the WaPo's HIT Lobby article here), that I was well prepared for, expecting a question along these lines from the seller community, actually.  The question was preceded by a bit of a soliloquy of the "You're trying to stop innovation through regulation" type, with a tad of Merck/VIOXX ad hominem thrown in (I ran Merck Research Labs' Biomedical libraries and IT group in 2000-2003).

His question was along the lines of - you were at Merck; VIOXX was bad; health IT allowed discovery of the VIOXX problem by Kaiser several years before anyone else; you're trying to halt IT innovation via demanding regulation of the technology thus harming such capabilities and other innovations.

The audience was visibly unsettled.  Someone even hollered out their disapproval of the question.

My response was along the lines that:

  • VIOXX was certainly not Merck at its best, but regulation didn't stop Merck from "revolutionizing" asthma and osteoporosis via Singulair and Fosamax;
  • That I'm certainly not against innovation; I'm highly pro-innovation;
  • That our definitions of "innovation" in medicine might differ, in that innovation without adherence to medical ethics is not really innovation.  It is exploitation.

I stand by that assessment.

More from the Globe article:

There is little good research into how the systems improve health care and there are big obstacles to fixing even the known problems, said Ross Koppel, a professor of sociology at the University of Pennsylvania who studies hospital culture and medication errors.

Some developers require providers to sign nondisclosure agreements before using their systems, and the safety plan does not prohibit such gag clauses.  [Note: I wrote on this issue here, and in a published July 2009 JAMA letter to the editor "Health Care Information Technology, Hospital Responsibilities, and Joint Commission Standards" here - ed.]  While the plan addresses reporting of known problems, Koppel said it will not help researchers and developers understand problems that go unnoticed but that may be causing real patient harm. 

“We only know the tip of the iceberg” about how electronic health records affect patient care, said Koppel, who was an official reviewer for the Institute of Medicine report.

As per the title of this blog post, we are in a dark place, ethically, when a PhD sociologist who has never taken the Oath of Hippocrates (to my knowledge) appears to express more concern for patient safety and patients' rights than a Harvard physician-informatics Key Opinion Leader such as Dr. Halamka.

Koppel said the mantra of the Office of the National Coordinator has been that more health IT leads to better health care. “It probably is better than paper,” he said, “but it could be so much better than it is.”

I agree, but with caveats.  I opine that bad health IT is likely worse for patients than a good, well-staffed paper-based system.  For instance, the former can cause systematic dangers that even a bad paper system cannot, such as tens of thousands of prescription errors (see my Nov. 2011 post "Lifespan Rhode Island: Yet another health IT 'glitch' affecting thousands - that, of course, caused no patient harm that they know of - yet") or mass privacy breaches (see the current 30 or so posts on that issue at this blog query link: http://hcrenewal.blogspot.com/search/label/medical record privacy).

On good health IT and bad health IT from my teaching site "Contemporary Issues in Medical Informatics: Good Health IT, Bad Health IT, and Common Examples of Healthcare IT Difficulties" at http://www.ischool.drexel.edu/faculty/ssilverstein/cases/:

Good Health IT ("GHIT") is IT that provides a good user experience, enhances cognitive function, puts essential information as effortlessly as possible into the physician’s hands, keeps eHealth information secure, protects patient privacy and facilitates better practice of medicine and better outcomes. 

Bad Health IT ("BHIT") is IT that is ill-suited to purpose, hard to use, unreliable, loses data or provides incorrect data, causes cognitive overload, slows rather than facilitates users, lacks appropriate alerts, creates the need for hypervigilance (i.e., towards avoiding IT-related mishaps) that increases stress, is lacking in security, compromises patient privacy or otherwise demonstrates suboptimal design and/or implementation.
 

The Boston Globe article concludes:

Ashish Jha, associate professor of health policy at Harvard School of Public Health and a member of the panel that drafted the Institute of Medicine report, said he wants doctors to be able to report problems -- errors in medication lists, for example -- in real-time so they can be found and fixed quickly. The safety plan does not require systems to have that capability, but Daniel said her office could soon add such a requirement for products that receive federal certification.

The bigger problem is that health care as a whole needs a better way of tracking patient safety, Jha said. Monitoring issues caused by electronic health records “should be a part of it, and then we can actually know if this is a small, medium or large contributor to patient safety issues,” he said. “But we don’t know that.”

I agree with Dr. Jha, but the IT sellers and healthcare organizations will (legitimately) claim that adding real-time error reporting/forwarding to their products will be extremely resource-intensive.

I have an alternate approach that will require little effort on the part of the sellers and user organizations.

  • Post a message at the sign-in screen of all health IT along the lines that "This technology is experimental, adopted willingly by [organization] although not rigorously vetted for safety, reliability, usability, nor fitness for purpose, and thus you use it at your own risk.  If problems occur, report them to the following" ...

"The following" could include a list of alternatives such as I wrote in my Aug. 2012 post "Clinicians: How to Document the EHR Screens You Encounter That Cause Concern."


... When a physician or other clinician observes health IT problems, defects, malfunctions, mission hostility (e.g., poor user interfaces), significant downtimes, lost data, erroneous data, misidentified data, and so forth ... and most certainly, patient 'close calls' or actual injuries ... they should (anonymously if necessary if in a hostile management setting):

  • Inform their facility's senior management, if deemed safe and not likely to result in retaliation such as being slandered as a "disruptive physician" and/or being subjected to sham peer review (link).
  • Inform their personal and organizational insurance carriers, in writing. Insurance carriers do not enjoy paying out for preventable IT-related medical mistakes. They have begun to become aware of HIT risks. See, for example, the essay on Norcal Mutual Insurance Company's newsletter on HIT risks at this link. (Note - many medical malpractice insurance policies can be interpreted as requiring this reporting, observed occasional guest blogger Dr. Scott Monteith in a comment to me about this post.)
  • Inform the State Medical Society and local Medical Society of your locale.
  • Inform the appropriate Board of Health for your locale.
  • If applicable (and it often is), inform the Medicare Quality Improvement Organization (QIO) of your state or region. Example: in Pennsylvania, the QIO is "Quality Insights of PA."
  • Inform a personal attorney.
  • Inform local, state and national representatives such as congressional representatives. Sen. Grassley of Iowa is aware of these issues, for example.


See the actual post for an idea about clinicians seeking indemnification when forced by healthcare organizations to use bad health IT.  I can attest to actually seeing HIT policies that call for "human resources actions" if clinicians refuse to use HIT, or cannot learn to use it at a sufficient pace.

(Left out of this reiteration is the demonstration on photographing problematic EHR screens.  See the post for the details - it is easy to do, even with a commodity cellphone.)

HHS should be promoting laws on protection from retaliation upon clinicians reporting problems in good faith.

Thus, physicians, nurses and other clinicians can create needed health IT transparency and help our society discover the true level of risks of bad health IT.  They simply need the right information on what to do and where to report, bypassing the ONC office and, in the spirit of medicine, taking such matters into their own hands in the interests of patient care and medical ethics.

I also made recommendations to the Pennsylvania Patient Safety Authority on how known taxonomies of health IT-related medical error can be used, and need to be used, to promote error reporting in common formats.  Slides from my presentation to the Authority entitled "Asking the Right Questions:  Using Known HIT Safety Issues to Improve Risk Reporting and Analysis", given in July 2012 at their invitation, are at http://www.ischool.drexel.edu/faculty/ssilverstein/PA_patient_safety_Jul2012.ppt

Finally, another sign of progress:  unlike the HITECH Act, this new ONC plan is open to public comment.

-- SS

Addendum Jan. 8, 2013:

Dr. Halamka has put more details regarding his views in his blog.  The entry is entitled "Electronic Health Record Safety" at this link:  http://geekdoctor.blogspot.com/2013/01/electronic-health-record-safety.html .

He writes:

... Some have questioned the wisdom of moving forward with EHRs before we are confident that they are 100% safe and secure.   [That, of course, is not my argument - nothing is ever 100% safe and secure.  However, we don't yet know just how safe and secure - or unsafe and insecure - HIT is.  That is the issue I am concerned about - ed.] I believe we need to continue our current implementation efforts.

I realize it is a controversial statement for me to make, but let me use an analogy.

When cars were first invented, seat belts, air bags, and anti-lock brakes did not exist.  Manufacturers tried to create very functional cars, learned from experience how to make them better, then innovated to create new safety technologies, many of which are now required by regulation.

Writing regulation to require seat belts depended on experience with early cars.

My grandmother was killed by a medication error caused by lack of an EHR.  My mother was incapacitated by medication issues resulting from lack of health information exchange between professionals and hospitals.   My wife experienced disconnected cancer care because of the lack of incentives to share information.     Meaningful Use Stage 2 requires the functionality in EHRs which could have prevented all three events.

I express my condolences on those events.

I disagree, however, with continuing national implementation efforts at the current rate, with penalties for non-adopters.  I opine from the perspective of believing health IT has not reached a stage where it is ready for national rollout and remains experimental, its magnitude of harms admittedly unknown and information flows systematically impaired.  I recommend and prefer great caution under those circumstances, and remediation of those circumstances before full-bore national implementation.

I will leave it to the reader to ponder the two views.

-- SS

Thursday, November 29, 2012

Cybernetik Über Alles Again: HHS and Sebelius - Hospitals And Their Computers Have More Rights Than Patients

A Nov. 29, 2012 New York Times article by Reed Abelson entitled "Medicare Is Faulted on Shift to Electronic Records" observes that:

The conversion to electronic medical records — a critical piece of the Obama administration’s plan for health care reform — is “vulnerable” to fraud and abuse because of the failure of Medicare officials to develop appropriate safeguards, according to a sharply critical report to be issued Thursday by federal investigators [the report from HHS OIG is here - ed.] ... Medicare, which is charged with managing the incentive program that encourages the adoption of electronic records, has failed to put in place adequate safeguards to ensure that information being provided by hospitals and doctors about their electronic records systems is accurate. To qualify for the incentive payments, doctors and hospitals must demonstrate that the systems lead to better patient care, meeting a so-called meaningful use standard by, for example, checking for harmful drug interactions. [I note that meeting EHR "meaningful use" standards does not necessarily signify better care; the "standards" are experimental - ed.]

Hospitals and doctors are lying about their EHR efforts, in order to gain incentive payments, it seems.

In an article "IG says program is 'vulnerable' to abuse, better oversight needed", Fred Schulte at the Center for Public Integrity notes:

... the Centers for Medicare and Medicaid Services has since paid out more than $3.6 billion to medical professionals who made the switch without verifying they are meeting the required quality goals, according to a new federal audit to be released today

Observes the CEO of the American Health Information Management Association:

“We’ve gone from the horse and buggy to the Model T, and we don’t know the rules of the road. Now we’ve had a big car pileup,” said Lynne Thomas Gordon, the chief executive of the American Health Information Management Association, a trade group in Chicago. 

More Horse and Buggy than Model T.  At least the Model T was reasonably dependable. 

Also mentioned is this:

House Republicans echoed these concerns in early October in a letter to Kathleen Sebelius, secretary of health and human services. Citing the Times article, they called for suspending the incentive program until concerns about standardization had been resolved. “The top House policy makers on health care are concerned that H.H.S. is squandering taxpayer dollars by asking little of providers in return for incentive payments,” said a statement issued at the same time by the Republicans, who are likely to seize on the latest inspector general report as further evidence of lax oversight. Republicans have said they will continue to monitor the program.

In her letter in response, which has not been made public, Ms. Sebelius dismissed the idea of suspending the incentive program, arguing that it “would be profoundly unfair to the hospitals and eligible professionals that have invested billions of dollars and devoted countless hours of work to purchase and install systems and educate staff.”


I was taught "first, do no harm."  Fairness to patients injured and killed by this technology in its present "Horse and Buggy" state (buggy being a particularly apropos term) seems not a matter of particularly high concern to HHS.   A suspension of incentives would slow the adoption rate down, necessary in order to "get the bugs" out of the technology before mass deployment and develop safety, validation and surveillance standards (currently non-existent), as I wrote in my Oct. 24, 2012 "Letter To U.S. Senators and Representatives Who've Sought HHS Input On EHR Problems."

This is despite the fact that FDA, IOM and others have indicated the level of harm is not known, due to systematic impediments to diffusion of that knowledge (see IOM statements in the midsection of my post on health information technology hyper-enthusiasm at this link, and an internal FDA memo on HIT safety at this link). 

HHS seems to care not about health and human services, or at best to be severely misguided.  "Cybernetik Über Alles" seems their current credo.

-- SS

Sunday, September 30, 2012

UK: Another Example of IT Malpractice With Bad Health IT (BHIT) Affecting Thousands of Patients, But, As Always, Patient Care Was "Not Compromised"

In my Dec. 2011 post "IT Malpractice? Yet Another 'Glitch' Affecting Thousands of Patients. Of Course, As Always, Patient Care Was 'Not Compromised'" and others, I noted:

... claims [in stories regarding health IT failure] that "no patients were harmed" ... are both misleading and irrelevant:

Such claims of 'massive EHR outage benevolence' are misleading, in that medical errors due to electronic outages might not appear for days or weeks after the outage ... Claims of 'massive EHR outage benevolence' are also irrelevant in that, even if there was no catastrophe directly coincident with the outage, there was greatly elevated risk. Sooner or later, such outages will maim and kill.

Here is a prime example of why I've opined at my Sept. 2012 post "Good Health IT (GHIT) v. Bad Health IT (BHIT): Paper is Better Than The Latter" that a good or even average paper-based medical record keeping system can facilitate safer and better provision of care than a system based on bad health IT (BHIT).

Try this with paper:

NHS 'cover-up' over lost cancer patient records

Thousands awaiting treatment were kept in the dark for five months when data disappeared

Sanchez Manning
The Independent
Sunday 30 September 2012

Britain's largest NHS trust took five months to tell patients it had mislaid medical records for thousands of people waiting for cancer tests and other urgent treatments. Imperial College Healthcare NHS Trust discovered in January that a serious computer problem and staff mistakes had played havoc with patient waiting lists.

It's quite likely the "serious computer problem" far outweighed the impact of "staff mistakes", as computer data disappears "silently": one does not realize it is missing, because there is generally no trail of evidence that it is gone.

About 2,500 patients were forced to wait longer on the waiting lists than the NHS's targets, and the trust had no idea whether another 3,000 suspected cancer patients on the waiting list had been given potentially life-saving tests. Despite the fact that the trust discovered discrepancies in January and was forced to launch an internal review into the mess, including 74 cases where patients died, it did not tell GPs about the lost records until May.

That is, quite frankly, outrageous if true and (at least in the U.S.) might be considered criminally negligent (failure to use reasonable care to avoid consequences that threaten or harm the safety of the public and that are the foreseeable outcome of acting in a particular manner).

Revelations about the delay prompted a furious response yesterday from GPs, local authorities and patients' groups. Dr Tony Grewal, one of the GPs who had made referrals to Imperial, said doctors should have been told sooner to allow them to trace patients whose records were missing. "The trust should have contacted us as soon as it was recognised that patients with potentially serious illnesses had been failed by a system," he said. "GPs hold the ultimate responsibility for their patient care."

That is axiomatic.

The chief executive of the Patients Association, Katherine Murphy, added: "This is unacceptable for any patient who has had any investigation, but especially patients awaiting cancer results, where every day counts. The trust has a duty to contact GPs who referred the patients. It's unfair on the patients to have this stress and worry, and the trust should not have tried to hide the fact that they had lost these records. They should have let the GPs know at the outset."

Unfair to the patients is an understatement.  However, if one's attitude is that computers have more rights than patients - as many in the health IT sector seem to hold, given their ignoring of patient rights such as informed consent, the lack of safety regulation, and the lack of accountability - then it's quite acceptable.

The trust defended the delay in alerting GPs, arguing that it needed to check accurately how much data it had lost before making the matter public. It said a clinical review had now concluded that no one died as a result of patients waiting longer for tests or care.

That would be perhaps OK if the subjects whose "data had been lost" through IT malpractice were lab rats.

Despite this, three London councils – Westminster, Kensington and Chelsea, and Hammersmith and Fulham – are deeply critical of the way the trust handled the data loss. Sarah Richardson, a Westminster councillor who heads the council's health scrutiny committee, said that trust bosses had attempted to "cover up" the extent of the debacle. "Yes, they've done what they can but, in doing so, [they] put the reputation of the trust first," she said. "Rather than share it with the GPs, patients and us, they thought how can we manage this information internally. They chose to consider their reputation over patient care."

As I wrote in my Oct. 2011 post "Cybernetik Über Alles: Computers Have More Rights Than Patients?", to be more specific, they may have put the reputation of the Trust's computers first.

Last week, it was revealed that Imperial has been fined £1m by NHS North West London for the failures that led to patient data going missing. On Wednesday, an external review into the lost records said a "serious management failure" was to blame for the blunder.

Management of what, one might ask?

Imperial's chief financial officer, Bill Shields, admitted at a meeting with the councils that the letter could have been produced more quickly. He said that, at the time, the trust had operated with "antiquated computer systems" and had a "light-touch regime" on elective waiting times.

Version 2.0A will, as the typical refrain goes, fix all the problems.

Terry Hanafin, the leading management consultant who wrote the report, said the data problems went back to 2008 and had built up over almost four years until mid-2011. Mr Hanafin said the priorities of senior managers at that time were the casualty department and finance.

Clinical computing is not business computing, I state for the thousandth time.  When medical data is discovered to be "lost", the only acceptable response is to find it, or to inform patients and clinicians - immediately.

He further concluded that while the delays in care turned out to be non-life threatening, they had the potential to cause pain, distress and, in the case of cancer patients, "more serious consequences" ... The trust said it had found no evidence of clinical harm and stressed that new systems have now been implemented to record patient data. It denied trying to cover up its mistakes or put its reputation before concerns for patients. "Patient safety is always our top priority," said a spokesman.

"More serious consequences" is a euphemism for horrible metastatic cancer and death, I might add.  The leaders simply cannot claim they "found no evidence of clinical harm" regarding delays in cancer diagnosis and treatment until time has passed and follow-up studies have been performed on this group of patients.

This refrain is evidence that these folks are either lying, CYA-style, or have no understanding of clinical medicine whatsoever - in which case, in my opinion, their authority over clinical matters needs to end.

I, for one, would like to know the exact nature of the "computer problem", who was responsible, and if it was a software bug, how such software was validated and how it got into production.

-- SS

Oct. 1, 2012 Addendum:

What was behind the problems, according to another source?   

Bad Health IT (BHIT):

Poor IT behind Imperial cancer problems
e-Health Insider
28 September 2012
Rebecca Todd

An independent review of data quality issues affecting cancer patient referrals to Imperial College Healthcare NHS Trust has identified “poor computer systems” as a key cause of the problem.

The review’s report highlights the trust’s use of up to 17 different IT systems as causing problems for patient tracking.

However, it says the trust should be aware of the risks of [replacing the BHIT and] moving to a single system, Cerner Millennium, because of reported problems in providing performance data after similar moves at other London trusts.

In January 2012, the report says the NHS Intensive Support Team was reviewing the way reports on cancer waiting times were created from Imperial’s cancer IT system, Excelicare.

The team discovered that almost 3,000 patients were still on open pathways who should have been seen within two weeks. In May, letters were sent to GPs to try and ascertain the clinical status of around 1,000 patients.

BHIT must be forbidden from real-world deployment, and rapidly fixed or dismantled (as Imperial College Healthcare NHS Trust appears to be doing), although the "solution" might be just as bad as, or worse than, the disease.

-- SS

Friday, October 28, 2011

Cybernetik Über Alles: Computers Have More Rights Than Patients?

[Note: this essay contains many hyperlinks. They can be right-clicked and opened in a separate tab or window.]

What medical devices are shielded from liability?

Are there other examples of legislation seeking legal protections for wide-scale use of medical devices that even the device's trade group leadership admits are not ready, and are experimental?

Here we have a proposal from a member of the U.S. Congress to shield health IT software, a medical device (per the Director of FDA's CDRH - the Center for Devices and Radiological Health - and others), and its users from liability through an apparently unique special accommodation.

This from iHealthBeat.org:
Thursday, October 27, 2011
On Wednesday, Rep. Tom Marino (R-Penn.) introduced legislation (HR 3239) that would create certain legal protections for Medicare and Medicaid providers who have implemented electronic health record systems, the Wilkes-Barre Times Leader reports.
The bill -- called the Safeguarding Access for Every Medicare Patient Act -- would create a system for reporting potential medical errors that occur when using EHRs, but it would not allow such information to be used as legal admission of wrongdoing.
The bill would cover certain physicians and hospitals that serve Medicare and Medicaid beneficiaries. It also would cover participants and users of health information exchanges.
Marino, who is a member of the House Judiciary Committee, said that offering the new legal protections to health care providers would promote greater use of EHRs and encourage Medicare and Medicaid providers to continue serving beneficiaries. [As if they could not do so without EHR's? - ed.]
He said, "Many providers are reluctant to use [EHRs] because they believe the practice will make them more vulnerable to unnecessary legal action," [unnecessary? How about real and necessary, as per the White Paper Do EHR's Increase Liability? - ed] adding, "This [bill] protects access for seniors in the Medicare and Medicaid programs" (Riskind, Wilkes-Barre Times Leader, 10/27).

From Rep. Marino's website (my comments are in [bracketed red italics]):


Marino Introduces Safeguarding Access For Every Medicare Patient Act
FOR IMMEDIATE RELEASE
Oct. 26, 2011
WASHINGTON -- U.S. Rep. Tom Marino, PA-10, has introduced legislation that offers limited legal protection to Medicare and Medicaid providers who use electronic records. [Which, I fear, could effectively act as, or mutate into, absolute protection in the environs of the legal system - ed.]
HR3239, the Safeguarding Access For Every Medicare Patient Act, would ensure patient access to Medicare and Medicaid providers; reduce health care costs [really? That's not what Wharton and others write - ed.]; guarantee incentives to providers to remain in the Medicare and Medicaid programs; and promote participation in health information technology.
Providers will eventually be required to participate in electronic recordkeeping or face a reduction in payments.
Marino said the bill offers incentive in the form of legal protection to providers who may be reluctant to remain in the Medicare and Medicaid programs due to low reimbursement rates which are constantly being targeted for further reductions.
[I imagine the known risks of health IT such as these at "MAUDE and HIT Risks: What in God's Name is Going on Here?" are a minor consideration if you receive legal immunity - ed.]
HR3239 would create a system for reporting potential errors that occur when using electronic records without the threat of that information being used as an admission of guilt. [Even if the physician or nurse is guilty of EHR-caused or aggravated, i.e., "use error" per NIST, malpractice - ed.]
It also prevents electronic records from being used as an easy source for “fishing expeditions,’’ [like this case, this case, this case and this case where patients died? - ed.] while making sure that parties responsible for errors are held accountable [how? -ed].
The proposal allows for providers who use electronic records to take remedial measures without having those actions be used to establish guilt [even though remediation may be very relevant to malpractice, patient injury and death prior to the remediation, and the remediation is informed by the error - ed.]; places time limits on the filing of lawsuits; and offers protection against libel and slander lawsuits.
[If this provision were to allow clinicians to speak publicly about HIT flaws without legal retaliation or sham peer review, I'd be all for it - ed.]
“Many providers are reluctant to use electronic records because they believe the practice will make them more vulnerable to unnecessary legal action,” Marino said. [I think it's much more likely they are reluctant to use them due to the aforementioned hair-raising MAUDE reports and literature such as here, here and here - ed.] “Every time a doctor or hospital chooses not to participate because of these fears, our seniors lose another provider. This protects access for seniors in the Medicare and Medicaid programs.”
Marino said HR3239 is a two-pronged attack against rising health care costs: It provides legal protection to providers while encouraging the use of health information technology which has been shown to reduce costs. [See above links on that issue - ed.]
“Best of all, passage of this bill would require no new spending,” Marino said. [Besides the hundreds of billions to be spent on the IT itself - ed.]

This sounds like a healthcare IT vendor marketing piece, with claims refuted repeatedly here at HC Renewal, usually via the biomedical literature. It's slick, purporting to "protect Medicare access" while actually promoting health IT sales.

Did Rep. Marino get snowed by the health IT lobby? (See "The Machinery Behind Healthcare Reform" in the Washington Post.)

A major question is:

What of the patients, and their rights to redress for injuries that occur due to EHR's? Chopped liver?

Isn't this bill really saying that patients are experimental subjects with limited rights? In other words, that improving EHR's should come at the expense of the unfortunate patients treated under their auspices? That the computers have more rights than the patients?

That line of thinking about what in reality is unconsented medical experimentation (i.e., "First, let's experiment" as opposed to "First, do no harm") has led to some very dark places in medicine, and not just in ancient history (e.g., see "Bioethics panel blasts late U. Pittsburgh professor").

See this reading list for more on these issues. Also see the many other posts on this blog about health IT quality, usability, efficacy, risk (and that the levels of that risk are admittedly unknown), lack of informed consent, and other issues via query links such as here, here, here and here - and the hyperlinks within those lists of posts - to more fully understand this perspective.

The text of the proposed legislation is here. While not all bad, it raises a number of concerns.

Excerpts are as follows:

H. R. 3239

To provide certain legal safe harbors to Medicare and Medicaid providers who participate in the EHR meaningful use program or otherwise demonstrate use of certified health information technology.

... SEC. 4. RULES RELATING TO E-DISCOVERY.

    In any health care lawsuit against a covered entity that is related to an EHR-related adverse event, with respect to certified EHR technology used or provided by the covered entity, electronic discovery shall be limited to--

      [I'm not sure what "certification" has to do with litigation, since "certification" of health IT has nothing to do with safety or usability; see note below - ed.]

      (1) information that is related to [what does that mean? - ed.] such EHR-related adverse event; and
      (2) information from the period in which such EHR-related adverse event occurred.

      [eDiscovery related to EHR-related adverse events is already difficult, e.g., obtaining complete metadata. What these provisions would likely do is increase the complications through legal maneuvering over terms such as "related to", "period", etc. - ed.]

SEC. 5. LEGAL PROTECTIONS FOR COVERED ENTITIES.

    (a) General- For a covered entity described in section 2, the following protections apply:
      (1) ENCOURAGING SPEEDY RESOLUTION OF CLAIMS-
        (A) GENERAL- A claimant may not commence a health care lawsuit against a covered entity on any date that is 3 years after the date of manifestation of injury or 1 year after the claimant discovers, or through the use of reasonable diligence should have discovered, the injury, whichever occurs first. This limitation shall be tolled to the extent that the claimant is able to prove--
          (i) fraud;
          (ii) intentional concealment; or
          (iii) the presence of a foreign body, which has no therapeutic or diagnostic purpose or effect, in the person of the injured person.
      ... (2) EQUITABLE ASSIGNMENT OF RESPONSIBILITY- In any health care lawsuit against a covered entity--
        (A) each party to the lawsuit other than the claimant that is such a covered entity shall be liable for that party's several share of any damages only and not for the share of any other person and such several share shall be in direct proportion to that party's proportion of responsibility for the injury, as determined under clause (iii);
        (B) whenever a judgment of liability is rendered as to any such party, a separate judgment shall be rendered against each such party for the amount allocated to such party [does that include the IT vendor? - ed.] ; and
        (C) for purposes of this paragraph, the trier of fact shall determine the proportion of responsibility of each such party for the claimant's harm.
      (3) SUBSEQUENT REMEDIAL MEASURES- Evidence of subsequent remedial measures to an EHR-related adverse event with respect to certified EHR technology used or provided by the covered entity (including changes to the certified EHR system, additional training requirements, or changes to standard operating procedures) by a covered entity shall not be admissible in health care lawsuits.

      [This in and of itself seems to give special accommodation to health IT, since remediation helps make the case for the presence of problems to begin with - ed.]
      (4) INCREASED BURDEN OF PROOF PROTECTION FOR COVERED ENTITIES- Punitive damages may, if otherwise permitted by applicable State or Federal law, be awarded against any covered entity in a health care lawsuit only if it is proven by clear and convincing evidence that such entity acted with reckless disregard for the health or safety of the claimant. In any such health care lawsuit where no judgment for compensatory damages is rendered against such entity, no punitive damages may be awarded with respect to the claim in such lawsuit.

      [Would that apply to a case such as this? Does it apply to the health IT vendors and their often cavalier software development and quality practices, if patients become injured, such as here, "A Study of an Enterprise Health Information System?" How about to this case, "A Lawsuit Over Healthcare IT Whistleblowing?" - ed.]

      (5) PROTECTION FROM LIBEL OR SLANDER- Covered entities and employees, agents and representatives of covered entities are immune from civil action for libel or slander arising from information or entries made in certified EHR technology and for the transfer of such information to another eligible provider, hospital or health information exchange, if the information, transfer of information, or entries were made in good faith and without malice.

      [Does that include defects reports? - ed.]


    From an ethical perspective, when you know a technology can be unsafe, but you don't know the levels of risk it creates, and the literature is conflicting on the benefits (prima facie evidence the technology is still experimental), you do not promote its wide-scale use in medicine and offer special accommodations to the technology's producers and users. Period. This is especially true without explicit patient informed consent and opportunity for opt-out. To promote such technology is not ethical.

    Note: I believe the misunderstanding of "certification" of health IT contributes to the problems with such proposals. "Certification" of HIT has little if anything to do with safety, reliability, usability, etc. (e.g., see http://hcrenewal.blogspot.com/2010/03/on-oncs-proposed-establishment-of.html).

    "Certification" of health IT is not validation of safety, usability, efficacy, etc., but a pre-flight checklist of features, interoperability, security and the like. The certifiers admit this explicitly. See the CCHIT web pages for example. ("CCHIT Certified®, an independently developed certification that includes a rigorous inspection of an EHR’s integrated functionality, interoperability and security.")

    Health IT "certification" is not like Underwriters Laboratories (UL) certification of appliances. ("Independent, not-for-profit product safety testing and certification organization ... With more than a 116-year proven track record, UL has been defining safety from the public adoption of electricity to new breakthroughs that help protect our future. UL employees are committed to safeguarding people, places and products in new and innovative ways for today’s borderless world.")

    -- SS

    10/28/11 Addendum:

    This Representative seems to represent districts in Pennsylvania served by the Geisinger healthcare system, including Danville, PA, where its main campus is located. His legislative assistant for healthcare described Geisinger to me in glowing terms in a conversation today. However, I suggest that Geisinger does not have a perfect track record, e.g., see the post "A 'safe' technology? Factors contributing to an increase in duplicate medication order errors after CPOE implementation" and its reader comments and links.

    10/30/11 Addendum:

    It occurred to me that in the post "Is Healthcare IT a Solution to the Wrong Problem?", referencing a study published in the Nov. 25, 2010 New England Journal of Medicine entitled "Temporal Trends in Rates of Patient Harm Resulting from Medical Care" [Landrigan, N Engl J Med 363;22], I pointed out that the abilities of health IT to "reduce medical error" may be significantly less than imagined.

    This is because most medical errors have little to do with record keeping, but instead with human factors. See the post at http://hcrenewal.blogspot.com/2010/12/is-healthcare-it-solution-to-wrong.html.
    -- SS