
Thursday, March 20, 2014

Forbes contributor Nicole Fisher, self-described proponent of healthcare human rights, writes a long article on EHRs that neglects to mention patient harm

The field of health IT lives a charmed life.

Even self-described human rights advocates in healthcare seem unwilling to broach the topic of EHR-related harms, or the lack of any informed consent process regarding the use of these systems in a patient's care - even when they openly state their concern for patient and human rights.

This is despite known harms at potentially alarming levels, as indicated by organizations including the IOM, the FDA, the ECRI Institute, AHRQ, the Harvard-affiliated malpractice insurer CRICO, The Joint Commission, the NHS in the UK, and others.

Forbes contributor Nicole Fisher (http://www.forbes.com/sites/nicolefisher/) published an article in Forbes dated 3/18/2014 and entitled "Electronic Health Records - Expensive, Disruptive And Here To Stay" (http://www.forbes.com/sites/nicolefisher/2014/03/18/electronic-health-records-expensive-disruptive-and-here-to-stay/).

You can read the article at the above link.  It focuses on physician distraction and dissatisfaction, poor usability and usefulness, expense, and related issues.  Of course, quotes from HHS figure prominently.

Nowhere, however, is there any mention of EHR-related harms to patients, such as those reported by the aforementioned organizations - as in this very small sample of posts, each of which links to its sources:

Patient Safety & Quality Healthcare: "CRICO Malpractice Claims Analysis Confirms Risks in EHRs"
http://hcrenewal.blogspot.com/2014/02/patient-safety-quality-healthcare.html

Peering Underneath the Iceberg's Water Level: ECRI "Deep Dive" Study of Health IT "Events"
http://hcrenewal.blogspot.com/2013/02/peering-underneath-icebergs-water-level.html

Internal FDA memorandum on HIT risks
http://hcrenewal.blogspot.com/2010/08/smoking-gun-internal-fda-memorandum-of.html

IOM Report - "Health IT and Patient Safety: Building Safer Systems for Better Care"
http://hcrenewal.blogspot.com/2011/11/iom-report-on-health-it-safety-nix-fda.html

EHRs and Deadly glitches
http://hcrenewal.blogspot.com/search/label/glitch (multiple posts)

Not to forget issues of breach of privacy: 
http://hcrenewal.blogspot.com/search/label/medical%20record%20privacy (multiple posts)

Nor does the article mention the experimental, unregulated nature of this technology, or the lack of any informed consent process regarding patients' right to decide whether or not these systems are used in their care.

The only mention of "unintended consequences" links to "Unintended ICD-10 Consequences: Inadequate Clinical Documentation Can Negatively Impact Physician Profiles."

This is, in my view, disappointing, and seems to be yet another extraordinary accommodation of the healthcare IT industry - remarkable for an author with the following bio:

http://www.forbes.com/sites/nicolefisher/

Nicole Fisher is the Founder and Principal at HHR Strategies, a health care and human rights focused advising firm. Additionally, she is a Senior Policy Advisor and health policy expert on health economic analyses mainly focusing on Medicare, Medicaid and health reform, specifically as they impact women and children. Nicole runs a Health Innovation and Policy page at Forbes.com highlighting and advising companies, ideas and people that are changing the health care landscape. She is also currently pursuing her PhD at the University of North Carolina in the Health Policy and Management Department. Her writing has appeared in other publications such as Health Affairs, Wall Street Journal, Washington Post, Centers for Medicare & Medicaid Services Journal, Wright on Health, The Health Care Blog and Health Services Research. Before pursuing her PhD in health policy, Nicole earned her Master’s degree in Public Policy from the University of Chicago and her undergraduate degree from the University of Missouri. Her health care and policy work at those institutions had an emphasis on underserved populations, women's and children’s issues. She presides on several Boards for domestic and international health organizations and frequently speaks on health reform and human rights.

From this Forbes piece, one might conclude the computer has more rights than the patient or the clinician.

Finally, although I am a domain specialist (http://www.kaiserhealthnews.org/stories/2013/february/18/scot-silverstein-health-information-technology.aspx), I am getting rather tired of having to point out the obvious to major media outlets and writers, in essence doing their homework for them.  I'm sure other bloggers feel the same way. 

The title of the Forbes piece should have been "Electronic Health Records - Expensive, Disruptive, Deadly."

To the author of the Forbes piece and other HIT writers, here is the face (and gravestone) of someone injured by this "disruptive" technology.   http://hcrenewal.blogspot.com/2011/06/my-mother-passed-away.html

A few babies too:  http://hcrenewal.blogspot.com/2011/06/babys-death-spotlights-safety-risks.html

A family man:  http://hcrenewal.blogspot.com/2011/09/sweet-death-that-wasnt-very-sweet-how_24.html

I know of others from my legal work supporting the EHR-related injured and deceased that I cannot mention.  And I am but one person.

If health IT were causing rapes** and child abuse, rather than merely causing mundane severe injuries and equally mundane deaths, would the media pay more attention?

-- SS

** I sadly note that EHRs without proper security measures in force actually did enable rape-like behavior in 2011, as described at "EHR as Molestation Candidate Selector: What was this Resident looking for in the EHR before 'examining' female patients?" at http://hcrenewal.blogspot.com/2011/02/what-was-this-medical-resident-looking.html.

Tuesday, July 02, 2013

Is ONC's definition of "Significant EHR Risk" when body bags start to accumulate on the steps of the Capitol?

In a June 25, 2013 Bloomberg News article "Digital Health Records’ Risks Emerge as Deaths Blamed on Systems" by technology reporter Jordan Robertson (http://go.bloomberg.com/tech-blog/author/jrobertson40/),  Mr. Robertson wrote:

... “So far, the evidence we have doesn’t suggest that health information technology is a significant factor in safety events,” said Jodi Daniel (http://www.healthit.gov/newsroom/jodi-daniel-jd-mph), director of ONC’s office of policy and planning. “That said, we’re very interested in understanding where there may be a correlation and how to mitigate risks that do occur.”

In my opinion, this statement represents gross negligence by a government official.  Ms. Daniel indisputably works for a government agency pushing this technology.  She claims that "so far the evidence we have doesn't suggest" significant risk, while surely being aware (or having the fiduciary responsibility to be aware) of the impediments to obtaining such evidence.

From my March 2012 post "Doctors and EHRs: Reframing the 'Modernists v. Luddites' Canard to The Accurate 'Ardent Technophiles vs. Pragmatists' Reality" at http://hcrenewal.blogspot.com/2012/03/doctors-and-ehrs-reframing-modernists-v.html  (yes, this was more than a year ago):

... The Institute of Medicine of the National Academies noted this in their late 2011 study on EHR safety:


... While some studies suggest improvements in patient safety can be made, others have found no effect. Instances of health IT–associated harm have been reported. However, little published evidence could be found quantifying the magnitude of the risk.

Several reasons health IT–related safety data are lacking include the absence of measures and a central repository (or linkages among decentralized repositories) to collect, analyze, and act on information related to safety of this technology. Another impediment to gathering safety data is contractual barriers (e.g., nondisclosure, confidentiality clauses) that can prevent users from sharing information about health IT–related adverse events. These barriers limit users’ abilities to share knowledge of risk-prone user interfaces, for instance through screenshots and descriptions of potentially unsafe processes. In addition, some vendors include language in their sales contracts and escape responsibility for errors or defects in their software (i.e., “hold harmless clauses”). The committee believes these types of contractual restrictions limit transparency, which significantly contributes to the gaps in knowledge of health IT–related patient safety risks. These barriers to generating evidence pose unacceptable risks to safety.[IOM (Institute of Medicine). 2012. Health IT and Patient Safety: Building Safer Systems for Better Care (PDF). Washington, DC: The National Academies Press, pg. S-2.]

Also in the IOM report:

… “For example, the number of patients who receive the correct medication in hospitals increases when these hospitals implement well-planned, robust computerized prescribing mechanisms and use barcoding systems. But even in these instances, the ability to generalize the results across the health care system may be limited. For other products— including electronic health records, which are being employed with more and more frequency— some studies find improvements in patient safety, while other studies find no effect.

More worrisome, some case reports suggest that poorly designed health IT can create new hazards in the already complex delivery of care. Although the magnitude of the risk associated with health IT is not known, some examples illustrate the concerns. Dosing errors, failure to detect life-threatening illnesses, and delaying treatment due to poor human–computer interactions or loss of data have led to serious injury and death.”


I also noted that the 'impediments to generating evidence' effectively rise to the level of legalized censorship, as observed by Koppel and Kreda regarding gag and hold-harmless clauses in their JAMA article "Health Care Information Technology Vendors' Hold Harmless Clause: Implications for Patients and Clinicians", JAMA 2009;301(12):1276-1278. doi: 10.1001/jama.2009.398.

FDA had similar findings about impediments to knowledge of health IT risks, see my Aug. 2010 post "Internal FDA memorandum of Feb. 23, 2010 to Jeffrey Shuren on HIT risks. Smoking gun?" at http://hcrenewal.blogspot.com/2010/08/smoking-gun-internal-fda-memorandum-of.html.

I also note this from amednews.com's coverage of the ECRI Deep Dive Study (http://hcrenewal.blogspot.com/2013/02/peering-underneath-icebergs-water-level.html):


... In spring 2012, a surgeon tried to electronically access a patient’s radiology study in the operating room but the computer would show only a blue screen. The patient’s time under anesthesia was extended while OR staff struggled to get the display to function properly. That is just one example of 171 health information technology-related problems reported [voluntarily] during a nine-week period [from 36 hospitals] to the ECRI Institute PSO, a patient safety organization in Plymouth Meeting, Pa., that works with health systems and hospital associations in Kentucky, Michigan, Ohio, Tennessee and elsewhere to analyze and prevent adverse events. Eight of the incidents reported involved patient harm, and three may have contributed to patient deaths, said the institute’s 48-page report, first made privately available to the PSO’s members and partners in December 2012. The report, shared with American Medical News in February, highlights how the health IT systems meant to make care safer and more efficient can sometimes expose patients to harm.


One wonders if Ms. Daniel's definition of "significant" is when body bags start to accumulate on the steps of the Capitol.

I also note she is not a clinician but a JD/MPH.

I am increasingly of the opinion that non-clinicians need to be removed from positions of health IT leadership at regional and national levels.

In large part, many just don't seem to have the experience, insight, and perhaps the ethics necessary to understand the implications of their decisions.

At the very least, such people who never made it to medical school or nursing school need to be kept on a very short leash by those who did.

-- SS

Thursday, February 14, 2013

Bipartisan Policy Center's Health Innovation Initiative: Health IT Industry Officials Lying to Regulators With Impunity?

On Wednesday, February 13, 2013, The Bipartisan Policy Center's Health Innovation Initiative held a discussion on its new report: An Oversight Framework for Assuring Patient Safety in Health Information Technology.  The announcement is here:  https://bipartisanpolicy.org/news/press-releases/2013/02/bipartisan-policy-center-releases-recommendations-oversight-framework-pa

The report is here (PDF):  "An Oversight Framework for Assuring Patient Safety in Health Information Technology."

The "who's" of the Bipartisan Policy Center's Health Innovation Initiative included these people:

  • Senator Tom Daschle, Former U.S. Senate Majority Leader; Co-founder, Bipartisan Policy Center (BPC); and Co-leader, BPC Health Project
  • Carolyn M. Clancy, M.D., Director, Agency for Healthcare Research and Quality, Department of Health and Human Services
  • Farzad Mostashari, M.D., ScM, National Coordinator for Health Information Technology, Department of Health and Human Services
  • Peter Angood, M.D., Chief Executive Officer, American College of Physician Executives
  • Russ Branzell, Chief Executive Officer, Colorado Health Medical Group, University of Colorado Health
  • John Glaser, Ph.D., Chief Executive Officer, Siemens Health Services
  • Douglas E. Henley, M.D., FAAFP, Executive Vice President and Chief Executive Officer, American Academy of Family Physicians
  • Jeffrey C. Lerner, Ph.D., President and Chief Executive Officer, ECRI Institute
  • Ed Park, Executive Vice President and Chief Operating Officer, athenahealth
  • Emad Rizk, M.D., President, McKesson Health Solutions
  • Janet Marchibroda, Moderator; Director, BPC Health Innovation Initiative 

Unfortunately, I was unable to attend.  I was at the 2013 Annual Winter Convention of the American Association for Justice (the Trial Lawyers' Association) in Florida, as an invited speaker on health IT risk, its use in evidence tampering, and other legal issues.


"United for Justice" - click to enlarge



I found the following statement from the Bipartisan Policy Center's Health Innovation Initiative report remarkable as a "framework for health IT safety":

The Bipartisan Policy Center today proposed an oversight framework for assuring patient safety in health information technology. Among other guiding principles, the framework should be risk-based, flexible and assure patient safety is a shared responsibility, the authors said. “Assuring safety in clinical software in particular is a shared responsibility among developers, implementers, and users across the various stages of the health IT life cycle, which include design and development; implementation and customization; upgrades, maintenance and operations; and risk identification, mitigation and remediation,” the report states. Among other recommendations, the center said clinical software such as electronic health records and software used to inform clinical decision making should be subject to a new oversight framework, rather than traditional regulatory approaches [e.g.,  FDA - ed.] applied to medical devices given its lower risk profile.

I find it remarkable that the health IT industry and its supporters now feel they can lie to our government and regulatory agencies with impunity.  Stating that health IT has a "lower risk profile" is an example.

One cannot know what is acknowledged to be unknown.

From the Institute of Medicine in its 2012 report on health IT safety:

Institute of Medicine. 2012. Health IT and Patient Safety: Building Safer Systems for Better Care .  Washington, DC: The National Academies Press.

... While some studies suggest improvements in patient safety can be made, others have found no effect. Instances of health IT–associated harm have been reported. However, little published evidence could be found quantifying the magnitude of the risk.

Several reasons health IT–related safety data are lacking include the absence of measures and a central repository (or linkages among decentralized repositories) to collect, analyze, and act on information related to safety of this technology. Another impediment to gathering safety data is contractual barriers (e.g., nondisclosure, confidentiality clauses) that can prevent users from sharing information about health IT–related adverse events. These barriers limit users’ abilities to share knowledge of risk-prone user interfaces, for instance through screenshots and descriptions of potentially unsafe processes. In addition, some vendors include language in their sales contracts and escape responsibility for errors or defects in their software (i.e., “hold harmless clauses”). The committee believes these types of contractual restrictions limit transparency, which significantly contributes to the gaps in knowledge of health IT–related patient safety risks. These barriers to generating evidence pose unacceptable risks to safety.

... More worrisome, some case reports suggest that poorly designed health IT can create new hazards in the already complex delivery of care. Although the magnitude of the risk associated with health IT is not known, some examples illustrate the concerns. Dosing errors, failure to detect life-threatening illnesses, and delaying treatment due to poor human–computer interactions or loss of data have led to serious injury and death.” 

Even to those with particularly thick skulls, this statement seems easy to comprehend:

"The magnitude of the risk associated with health IT is not known."

I repeat once again:

One cannot know what is acknowledged to be unknown.

A statement that health IT has a "lower risk profile" compared to other regulated healthcare sectors such as devices or drugs, made in order to seek continued and extraordinary regulatory accommodations, is remarkable.  It is either recklessness regarding something the statement's makers should know, or should have made it their business to know - or a deliberate prevarication with forethought.

The report did attempt to shroud the declarative "lower risk profile" in a sugar coating through misdirection, citing the need to take into account "several factors" including:

"the level of risk of potential patient harm, the degree of direct clinical action on patients, the opportunity for clinician involvement, the nature and pace of its development, and the number of factors beyond the development stage that impact its level of safety in implementation and use." 

These "factors" speak to a higher level of potential risk, not lower, and are a justification for stronger regulatory oversight, not weaker.  I would opine that there is a possibility that health IT. through which almost all transactions of care need to pass (e.g., orders, results reporting, recording and review of observations, finding, diagnoses, prognoses, treatment plans, etc.), could have a higher risk profile than one-off devices or drugs.  Health IT affects every patient, not just those under a specific therapy or using a specific device or drug.

Partial taxonomies developed from limited data themselves speak to the issue of a potentially huge risk profile of health IT, e.g., the FDA Internal Memo on HIT Risks (link), the AHRQ Hazards Manager taxonomy (link), and the sometimes hair-raising voluntary defects reports (largely from one vendor) in the FDA MAUDE database (link).  Further, health IT can and does affect thousands or tens of thousands of patients en masse even due to one simple defect, such as happened in Rhode Island at Lifespan (link), or due to overall design and implementation problems such as at Contra Costa County, CA (link) and San Francisco's Dept. of Public Health (link).

We don't know the true levels of risk and harm - but we need to, and rapidly.  Industry self-policing is not the answer; it didn't work in drugs and devices, and even with regulation there are still significant problems in those sectors.  (Imagine how it would be if those sectors received the special accommodations that health IT receives, and wishes to continue to receive.)

My other issue is with the "shared responsibility" including "users."

The user's responsibility is patient care, not serving as a beta tester for bug-ridden or grossly defective health IT products.  Their responsibility ends at reporting problems - without fear of retaliation - and ensuring patient safety.

Their responsibility is to avoid carelessness - as it is when they drive their cars.

In other words, the inclusion of "users" in the statement is superfluous.

It is not their responsibility to be omniscient, nor to be held accountable when bad health IT promotes "use error" (the NIST definition of "use error" I will not repeat here; search the blog) - as opposed to, and as distinct from, "user error" (note the final "r"), i.e., carelessness.

Bad health IT (see here):

Bad Health IT ("BHIT") is defined as IT that is ill-suited to purpose, hard to use, unreliable, loses data or provides incorrect data, causes cognitive overload, slows rather than facilitates users, lacks appropriate alerts, creates the need for hypervigilance (i.e., towards avoiding IT-related mishaps) that increases stress, is lacking in security, compromises patient privacy or otherwise demonstrates suboptimal design and/or implementation. 

One special accommodation that the health IT industry has been afforded for far too long is to be able to "blame the user."

"Blaming the victim" of bad health IT is a more appropriate description.

-- SS

Wednesday, September 26, 2012

HIMSS Senior Vice President on Medical Ethics: Ignore Health IT Downsides for the Greater Good

The Healthcare Information and Management Systems Society (HIMSS) is the large health IT vendor trade group in the U.S.  In a Sept. 21, 2012 HIMSS blog post, John Casillas, Senior Vice President of HIMSS Financial-Centered Systems and the HIMSS Medical Banking Project, dismisses concerns about health IT with this refrain:

... To argue that the existence of something good for healthcare in many other ways, such as having the right information at the point of care when it’s needed, is actually bad because outliers use it to misrepresent claims activity is deeply flawed.

Through the best use of health IT and management systems, we have the opportunity to improve the quality of care, reduce medical errors and increase patient safety. Don’t let the arguments of some cast a cloud over the critical importance and achievement of digitizing patient health records.

Surely, no one can argue paper records are the path forward. Name one other industry where this is the case. I can’t.

Let’s not let the errors of a few become the enemy of good.

The ethics of these statements from a non-clinician are particularly perverse.

The statement "Don’t let the arguments of some cast a cloud over the critical importance and achievement of digitizing patient health records" is particularly troubling.

When those "some" include organizations such as the FDA (see the internal 2010 FDA memo on HIT risks, link) and the IOM's Committee on Patient Safety and Health Information Technology (see the 2012 report on health IT safety, link), both of which state that harms are definite but of unknown magnitude due to systematic impediments to collecting the data, and the ECRI Institute, which has kept health IT on its "top ten healthcare technology risks" list for several years running (link), the dismissal of "clouds" is unethical on its face.

These reports indicate that nobody knows whether today's EHRs improve or worsen outcomes relative to good paper record systems.  The evidence is certainly conflicting (see here).

It also means that the current hyper-enthusiasm to roll out this software nationwide in its present state could very likely be at the expense of the unfortunate patients who find themselves as roadkill on the way to the unregulated health IT utopia.

That's not medicine, that's perverse human subjects experimentation without safeguards or consent.

As a HC Renewal reader noted:

Astounding hubris, although it does seem to be effective.  Such is PC hubris.  Who could ever call for reducing the budget of the NIH that is intended to improve health.  Has health improved?  No.

So why does a group with spotty successes if not outright failure never get cut?  It’s not the results, it’s the mission that deserves the funding.  So it’s not the reality of HIT, it’s the promise, the mission, that gets the support.  Never mind the outcome, it’s bound to improve with the continued support of the mission.

Is this HIMSS VP aware of these reports?  Does he even care?

Does he believe patients harmed or killed as a result of bad health IT (and I know of a number of cases personally through my advocacy work, including, horribly, infants and the elderly) are gladly sacrificing themselves for the greater good of IT progress?

It's difficult to draw any conclusion from excuses such as those proffered other than that he and HIMSS simply don't care about the unintended consequences of health IT.

Regarding "Surely, no one can argue paper records are the path forward" - well, yes, I can.  (Not the path 'forward', but the path for now, at least, until health IT is debugged and its adoption and effects better understood).  And I did so argue, at my recent posts "Good Health IT v. Bad Health IT: Paper is Better Than The Latter" and "A Good Reason to Refuse Use of Today's EHR's in Your Health Care, and Demand Paper".  I wrote:

I opine that the elephant in the living room of health IT discussions is that bad health IT is infrequently, if ever, made a major issue in healthcare policy discussions.

I also opine that bad health IT is far worse, in terms of diluting and decreasing the quality and privacy of healthcare, than a very good or even average paper-based record-keeping and ordering system.  


This is a simple concept, but I believe it needs to be stated explicitly. 

A "path forward" that does not take into account these issues is the path forward of the hyper-enthusiastic technophile who either deliberately ignores or is blinded to technology's downsides, ethical issues, and repeated local and mass failures.

If today's health IT is not ready for national rollout - e.g., it causes harms of unknown magnitude (see this query link), results in massive breaches of security as in the "Good Reason" post above, and produces mayhem such as at this link - then:

The best - and most ethical - option is to slow down HIT implementation and allow paper-based organizations and clinicians to continue to resort to paper until these issues are resolved.  Resolution needs to occur in lab or experimental clinical settings without putting patients at risk - and with their informed consent.

Anything else is akin to the medical experimentation abuses of the past that led to current research subjects protections such as the "Ethical Guidelines & Regulations" used by NIH.

-- SS

Thursday, June 14, 2012

Ellmers Calls on Sebelius to Address Health IT Safety Concerns: A Responsible Voice in Government on Health IT and HIT Safety

The following press release is very welcome, and speaks for itself.  There is a responsible voice in the government wilderness.  It is perhaps no surprise it comes from a Congresswoman who is also a registered nurse:

Ellmers Calls on Sebelius to Address Health IT Safety Concerns



Safety Risks and Health IT-Related Errors Cited in IOM Recommendations

WASHINGTON – House Small Business Subcommittee on Healthcare and Technology Chairwoman Renee Ellmers (R-NC) today sent a letter to Kathleen Sebelius, Secretary of Health and Human Services (HHS), inquiring about whether the Department has adopted the Institute of Medicine’s (IOM) recommendations for improving the safety of health information technology (IT).
The report, issued in November, recommended several steps to be taken by HHS and called for greater oversight by the public and private sectors. The Secretary was called upon by the IOM to issue a plan within 12 months to minimize patient safety risks associated with health IT and report annually on the progress being made.  The report further recommended that the plan should include a schedule for working with the private sector to assess the impact of health IT on patient safety, and recommended several other steps to help improve the safety of health IT.

Specifically, Chairwoman Ellmers has requested a copy of the Secretary’s plan to minimize patient safety risks, a description of health IT-related errors that have resulted in patient risks, injuries and deaths, and the status of the development of a mechanism for health IT vendors and users to report health IT-related deaths.  She said that because health IT has the promise to improve health care delivery for patients, physicians and other medical professionals, she remains eager to work with the Secretary to ensure that health IT is safe, effective and affordable.

In an August 11, 2011 letter to Secretary Sebelius, Chairwoman Ellmers said that a modern, well-equipped office is critical to the practice of medicine, and asked the Secretary to undertake a study of health IT’s adoption, benefits and cost effectiveness, including medical error rates.

On June 2, 2011, Chairwoman Ellmers’ Subcommittee held a hearing on the barriers to health IT that are encountered by physicians and other health professionals in small and solo practices.   At the hearing, physicians expressed strong concerns about the cost of purchasing and maintaining health IT systems, as well as the staff training and downtime necessary to implement such a system.  Chairwoman Ellmers noted health IT’s great potential to improve health care delivery, decrease medical errors, increase clinical and administrative efficiency and reduce paperwork.

For more than twenty-one years before being elected to Congress, Chairwoman Ellmers served as a registered nurse, focusing on surgical care as Clinical Director of the Trinity Wound Care Center and later helping to manage the family's small medical practice with her husband, Dr. Brent Ellmers, a licensed surgeon. As a registered nurse and the wife of a surgeon, Ellmers understands that a modern, efficient and well-equipped office is critical to the practice of medicine.    

This voice of sanity is quite welcome.  I've spoken with Rep. Ellmers' office, pointing them to my Drexel Univ. writings and materials and recommending Sebelius' reply be gone over with a fine-toothed comb, from the perspective of health IT realities, not merely from the perspective of the Ddulite's good intentions.  (I also introduced her staffer to the concept of the Ddulite, the HIT hyper-enthusiast who ignores all downsides and ethical concerns.)

I also pointed out the ethical lapse in IOM's position of "wait and see" while HIT is pushed nationally under penalty of law, at the cost of hundreds of billions of dollars, when their own report (along with reports from the FDA here, The Joint Commission here, and others) admits they don't know the magnitude of benefits, risks and harms:

... While some studies suggest improvements in patient safety can be made, others have found no effect. Instances of health IT–associated harm have been reported. However, little published evidence could be found quantifying the magnitude of the risk.

Several reasons health IT–related safety data are lacking include the absence of measures and a central repository (or linkages among decentralized repositories) to collect, analyze, and act on information related to safety of this technology. Another impediment to gathering safety data is contractual barriers (e.g., nondisclosure, confidentiality clauses) that can prevent users from sharing information about health IT–related adverse events. These barriers limit users’ abilities to share knowledge of risk-prone user interfaces, for instance through screenshots and descriptions of potentially unsafe processes. In addition, some vendors include language in their sales contracts and escape responsibility for errors or defects in their software (i.e., “hold harmless clauses”). The committee believes these types of contractual restrictions limit transparency, which significantly contributes to the gaps in knowledge of health IT–related patient safety risks. These barriers to generating evidence pose unacceptable risks to safety.
[IOM (Institute of Medicine). 2012. Health IT and Patient Safety: Building Safer Systems for Better Care (PDF). Washington, DC: The National Academies Press, pg. S-2.]

As I wrote in my Nov. 2011 post "IOM Report - 'Health IT and Patient Safety: Building Safer Systems for Better Care' - Nix the FDA; Create a New Toothless Agency", the IOM's response to their own study was reckless and unethical (at best):

... The panel also recommends that the HHS secretary publicly report on the progress of health IT safety each year, beginning in 2012. If the secretary determines at any time that adequate safety progress has not been made, only then should the FDA take the regulatory lead and be given the resources to do so, the report recommends, adding that the agency should be developing a framework now to be prepared.

In the meantime, during each year of "watching for safety progress," innumerable patients are exposed to HIT's hazards and costs.  The pharmaceutical and medical device industries are afforded no such special accommodation.

-- SS

Thursday, December 01, 2011

Health IT Pundits and Perhaps the Most Logically Fallacious (And Even Cold-Hearted) Statement I've Seen About Health IT to Date

The KevinMD blog has reposted George Lundberg's MedPage Today post "Health IT: Garbage In, Garbage Out", retitled as "Health IT has problems, but is worth the price." I covered Dr. Lundberg's original post at my Nov. 16, 2011 essay "George Lundberg, MD: The Promise of Health IT, and a Caveat."

As the KevinMD blog is exceptionally well-read, I expected the HIT pundits to come in with "see no evil, hear no evil, speak no evil" accolades for the technology.

I was right, even early on.

Keep in mind that Dr. Lundberg specifically quotes me in his article:

... However, there is another harsh critic worth listening to.

His name is Dr. Scot Silverstein, and he seems to have made it his life's work to call attention to really bad problems that he discovers in this mass move to automation.

Heed his cautions. They are real.

The following pundit, who commented on KevinMD's reposting of the Lundberg essay, knows of my writings and opinions, including the fact that my relative was injured as a result of the technology. The comment is here (I do not know the commenter personally, having only exchanged numerous back-and-forth comments on a few health IT blogs in the past):

Margalit Gur-Arie [a partner at EHR pathway, LLC and Gross Technologies, Inc. - ed.]

Wow! There's something to be said for extreme statements, whether right or wrong.

... do EHRs kill people? Probably, but every single item used in medicine can be shown to have killed people at one time or another, depending on how you define "killed" [1]. Do more people get harmed where EHRs are present, compared to where they are not? There are no conclusive studies to that effect and there are no conclusive studies showing the opposite either. There are not very good studies at all, but if mass murder was occurring, we would have probably known by now.

The appeal to ridicule and/or the argumentum ad ignorantiam-like statement "if mass murder was occurring, we would have probably known by now" is both fallacious and egregious. Is that a criterion medicine uses, in an explicitly admitted situation where conclusive studies are lacking, to promote diffusion of some new treatment or tool? That is, since we don't observe catastrophic levels of toxicity, the toxicity is of minor import?

On other logical fallacies, the statement that "every medical intervention can kill", implying that any morbidity and mortality due to EHRs is just a foregone conclusion, is doubly fallacious.

One fallacy is the absolute nature of the statement itself. It isn't true that 'all medical interventions can kill.' Another fallacy is the cavalier lack of distinction between a small vs. large risk of injury or death.

That said, even without considering 1) the literature aggregated here, 2) the context of the IOM Committee on Patient Safety and Health Information Technology's report that states the technology has risks, and worse, that impediments to information diffusion prevent the magnitude of the risks from being known (PDF available here), and 3) the context of my relative's travails, this is perhaps the most wishy-washy, ethically unsatisfying, cold-hearted excuse for health IT's problems -- and for reneging on fixing those problems before national rollouts -- that I have ever seen.

The argument is so bad, it's difficult to parse out the precise nature of all the logical fallacies contained within.

COI disclosure: I note that I have no associations with, receive no payments or royalties from, or have any other relationships with healthcare IT vendors, consultants or customers. I decided to offer my services as an expert witness for attorneys on health IT-related injuries and records tampering as a result of my relative's travails, however.


Note:

[1] "Depending on how you define "killed"? Let me take a stab at that (it should be easier than defining
what the meaning of the word 'is' is). How about "resulted directly or indirectly in the termination of all biological functions, as in, the patient's dead?"

-- SS

Wednesday, November 23, 2011

Two Opposing Views of EHR: InformaticsMD vs. NextGen's Holder of "American Medical Informatics Certification for Health Information Technology"

The AMA's publication American Medical News recently quoted me following comments from IOM EHR Safety committee member Richard Cook in the Nov. 21, 2011 article "IOM calls for monitoring and probe of health IT hazards" by Kevin O'Reilly:

... Not everyone on the panel agreed with delaying FDA regulation. [Per the IOM report on health IT safety released Nov. 10, 2011, see here - ed.]

Committee member Richard I. Cook, MD, filed a dissent in the report in which he recommended that health IT systems be regulated as class III medical devices.

"It is quite remarkable that we're in this situation," said Dr. Cook, associate professor of anesthesia and critical care at the University of Chicago Pritzker School of Medicine. [Also, an expert in Medical Informatics - ed.] "It's not surprising that such adverse events are being found related to health IT, and it's not surprising that those promoting these systems have neither looked for them nor anticipated them. To make large-scale investments in these systems and only now be looking at the impact on patient safety borders on recklessness."

Scot M. Silverstein, MD, agreed.

"The bone I have to pick with the IOM report is that the action agenda is weak," said Dr. Silverstein, a consultant in medical informatics at the Drexel University College of Information Science and Technology in Pennsylvania.

It is unethical to expand health IT so dramatically without understanding the precise nature of the risks it poses to patients, Dr. Silverstein said.


Ironically, right below my statement was the following from HIT industry figure Charles Jarvis, blaming the user:

Users faulted

Leaders in the health IT industry also had their share of objections to some of the IOM panel's conclusions.

"We don't think there's a great deal of data to substantiate that there are major safety problems with the majority of electronic health records systems in use today," said Charlie Jarvis, executive committee vice chair of the EHR Assn., a trade group that represents 46 organizations that supply most of the EMR systems implemented in medical practices. "These products are safe, dependable, time-tested and display a lot of the safety features we think are necessary to prevent problems going forward."

Jarvis, also a vice president at the health IT firm NextGen, said vendors and the government should work to help physicians and other health professional users understand systems, take advantage of their safety features and avoid errors.

[Charitable translation: computers are infallible, so medical errors due to HIT are the user's fault, the Sept. 2011 National Institute of Standards and Technology (NIST) report on usability be damned. Clinicians should spend their valuable time learning to compensate for and then actually wading through mission hostile user experiences. If only those stupid doctors and nurses would use our cybernetic miracle tools the way we want, the members of the EHR Association could be making even more money. Oh, and by the way, the NIST's concept of "use error" [1] is nonsense. - ed.]


I presume the "EHR Assn." is the HIMSS EHR Association, with HIMSS itself being a gargantuan "cause-based, not-for-profit organization exclusively focused on providing global leadership for the optimal use of information technology and management systems for the betterment of healthcare":

The HIMSS Electronic Health Record (EHR) Association is a trade association of Electronic Health Record (EHR) companies, addressing national efforts to create interoperable EHRs in hospital and ambulatory care settings. The EHR Association operates on the premise that the rapid, widespread adoption of EHRs will help improve the quality of patient care as well as the productivity and sustainability of the healthcare system.

I observe that there are no conflicts of interest here that could cause Mr. Jarvis' stated opinions to be skewed towards the rights of computers and away from the rights of patients ... right?

First, Mr. Jarvis makes a logical error related to the error illustrated in my earlier post today "Magical Thinking on Health IT from ModernMedicine.com." His error is that of "proof by lack of evidence" [2]. No need to actually study the issue rigorously, despite repeated risk management-relevant incident reports (as opposed to the industry's preferred and highly erroneous term "anecdotes").

Just one recent, highly alarming example of an "anecdote" affecting probably tens of thousands of patients due to programming malpractice and grossly negligent quality assurance, at both vendor and end user hospitals, is illustrated here. Since it's an "anecdote", perhaps Mr. Jarvis would agree there's nothing to see there, so we should all move along.

(See the Aug. 2011 post "From a Senior Clinician Down Under: Anecdotes and Medicine, We are Actually Talking About Two Different Things" which puts the misuse of the "anecdotes" label in its proper place - the garbage can.)

Not only is "proof by lack of evidence" in the face of hair-raising incident and defects reports (e.g., as in FDA's MAUDE database) a prima facie logical fallacy unfitting in medicine, and in fact alien to medical ethics, but the IOM report specifically stated in no uncertain terms that nobody really knows the magnitude of the risks. This is due in part to numerous inhibitory factors in evidence diffusion. From the IOM report:

... Several reasons health IT–related safety data are lacking include the absence of measures and a central repository (or linkages among decentralized repositories) to collect, analyze, and act on information related to safety of this technology. Another impediment to gathering safety data is contractual barriers (e.g., nondisclosure, confidentiality clauses) that can prevent users from sharing information about health IT–related adverse events. These barriers limit users’ abilities to share knowledge of risk-prone user interfaces, for instance through screenshots and descriptions of potentially unsafe processes. In addition, some vendors include language in their sales contracts and escape responsibility for errors or defects in their software (i.e., “hold harmless clauses”). The committee believes these types of contractual restrictions limit transparency, which significantly contributes to the gaps in knowledge of health IT–related patient safety risks. These barriers to generating evidence pose unacceptable risks to safety.

Imagine such a situation in, say, the pharmaceutical, automotive, aviation, or nuclear power industries. The responsible individuals would likely be hauled off to jail.

However, this all may be irrelevant. After all, who can argue with the expert personal opinion of someone who holds "the American Medical Informatics Certification for Health Information Technology?" That astonishing credential could conceivably elevate Mr. Jarvis' opinion over all others - even mine, with my meager background in the domain.

I don't know if he still claims that credential, but he did as I described in my post about prior interactions with Mr. Jarvis and NextGen (dating back to 2004) in my Feb. 2009 post "NextGen and Vendor/Doctor Dialog: Yet Another Patronizing EHR Company of Certified HIT Experts?"

I guess the fact I'd never heard of such a qualification represents my dearth of familiarity with the field of Medical Informatics and healthcare information technology.

-- SS

Notes:

[1] "Use error" is a term used very specifically by NIST to refer to user interface designs that lead users to make errors of commission or omission. It is true that users do make errors, but many errors are due not to user error per se, but to designs that are flawed, e.g., poorly written messaging, misuse of color-coding conventions, omission of information, etc. From "NISTIR 7804: Technical Evaluation, Testing and Validation of the Usability of Electronic Health Records," available at http://www.nist.gov/healthcare/usability/upload/Draft_EUP_09_28_11.pdf (PDF).

[2] Example of proof by lack of evidence, courtesy Scott Adams: "I've never seen you drunk, so you must be one of those Amish people."

More on these issues is at the site "Contemporary Issues in Medical Informatics: Common Examples of Healthcare Information Technology Difficulties."

Tuesday, March 22, 2011

How Academic and Government "Anecdotes Are Not Data" Ideologues Kill People

I'm already receiving comments that Prof. Jon Patrick's observations in his detailed exposé of the dangers of ill-suited-for-purpose ED EHRs are:

... not really valid because they're not peer reviewed; they're just anecdotal.

Only an egghead could pen such words.

I always get hives immediately after eating strawberries. But without a scientifically controlled experiment with all the right peer review, it's not reliable data. So I continue to eat strawberries every day, since I can't tell if they cause hives.


I'd already written about anecdotalist refrains at my Mar. 7, 2011 post "Australian ED EHR Study: Putting the Lie to the Line 'Your Evidence Is Anecdotal, Thus Worthless' Used by Eggheads, Fools and Gonifs." In that essay I cite Dr. Patrick himself on "anecdotal evidence," regarding which he hit the ball out of the Southern Hemisphere in an editorial in Applied Clinical Informatics entitled "The Validity of Personal Experiences in Evaluating HIT."

Aside from the fact that eggheads also don't seem to care about the issues of faulty peer review, especially in profitable biomedical sectors - see "The Lancet Emphasizes the Threats to the Academic Medical Mission" with its embedded links, and "Has Ghostwriting Infected The Experts With Tainted Knowledge, Creating Vectors for Further Spread and Mutation of the Scientific Knowledge Base?" - there's this simple fact:

Public health catastrophe warnings from responsible sources don't need peer review, they need investigation.

Yes, there were those pesky, off-narrative journalistic reports that the Japanese nuclear reactors were not entirely safe, that Bernie Madoff was a fraud, that mortgages for everyone were not a good idea, that the O-rings couldn't withstand sub-freezing temperatures, that the foam that broke off Columbia's launch tank posed a danger, and that the Titanic didn't have enough lifeboats - but they weren't peer reviewed ... so we ignored them. Saved us a lot of money, too.


-- SS

Addendum:

At my post "
Real" Medical Informatics: What Does a Problem List of Typical Health IT Look Like, Part 2", I opined:

If the purpose of Medical Informatics is the improvement of healthcare (as opposed to career advancement of a small number of academics through publishing obscure articles about HIT benefits while ignoring downsides in rarified, echo-chamber peer reviewed journals), then:

  • Who are the "real" medical informatics specialists, and;
  • Who are the poseurs?

... researchers like Jon Patrick, who address real-world issues of HIT risk that are of great import to patients, and who go public on the web with their work without the full blessings of some dusty journal (and those like Ross Koppel who also directly address the downsides, and others who make material available to the public on blogs like this and this, in papers like this, and on sites like this), are the former.

Those who deem only "peer reviewed" articles worthy of daylight, and everything else - especially and particularly reports of downsides - "anecdotal" (the anecdotalists) are the latter.

I stand by this assertion.

Finally, I ask: at what point does ignoring work such as Prof. Patrick's, if patient harm is caused by the system he reviewed, constitute reckless endangerment and perhaps criminal negligence by hospital and government officials?

-- SS

Addendum Mar. 23:

As if on cue, this story appeared in the WSJ:

March 23, 2011

Japan Ignored Warning of Nuclear Vulnerability

TOKYO—Japanese regulators discussed in recent months the use of new cooling technologies at nuclear plants that could have lessened or prevented the disaster that struck this month when a tsunami wiped out the electricity at the stricken Fukushima Daiichi power facility.

However, they chose to ignore the vulnerability at existing reactors and instead focused on fixing the issue in future ones, government and corporate documents show. There was no serious discussion of retrofitting older plants with the alternative technology.

I guess the "vulnerability reports" just weren't peer reviewed, therefore meaningless - or not reviewed by the "right" peers.

This sounds like our own FDA, ONC office and Institute of Medicine (via the Committee on Patient Safety and Health Information Technology), "choosing to ignore" health IT "vulnerabilities" (such as the aforementioned) and focusing on future issues such as comparative effectiveness research, "the common good", etc. instead.

I call this attitude "reckless endangerment" and hope plaintiff attorneys are paying close attention.

-- SS

Tuesday, March 01, 2011

IOM Committee on Patient Safety and Health IT, Meeting Two: Institute of Medicine, or of Mediocrity?

In my Jan. 2011 post "Institute of Medicine Committee on Patient Safety and Health Information Technology, and Thoughts on Social Aspects of Health IT Evaluation" I wrote that:

The U.S. National Research Council of the National Academy of Sciences issued a report in early 2009 on the state of health IT. That study's report, led in part by pioneers in Medical Informatics G. Octo Barnett and William Stead, was entitled "Computational Technology for Effective Health Care: Immediate Steps and Strategic Directions" (pre-publication PDF available free at this link). The report was announced under the following header:

CURRENT APPROACHES TO U.S. HEALTH CARE INFORMATION TECHNOLOGY ARE INSUFFICIENT

The insufficiencies were largely in the areas of difficulties with data sharing and integration, deployment of new IT capabilities, large-scale data management, and lack of cognitive support by health IT for busy clinicians.

One might reasonably conclude such deficits could affect patient safety.

Recently the Institute of Medicine (the health arm of the National Academy of Sciences) formed a Committee to study health IT safety. It held its first meeting on Dec. 14, 2010 (quite a few years late in my opinion, and only after tens of billions of dollars have been earmarked for health IT, but better late than never):

The Institute of Medicine Committee on Patient Safety and Health Information Technology is holding its first meeting on December 14-15, 2010. The first day, December 14, 2010 beginning at 10:30 am, is open to the public to observe the committee proceedings. The committee will hear presentations by the Office of the National Coordinator and other invited guests. There will also be an opportunity for members of the public and representatives of interested organizations to make a brief statement before the committee. Prior registration is requested for attendees and required for those wishing to make a statement.

Here are links to the PPT presentations from Meeting 2 of the Committee on Patient Safety and Health IT that took place Feb. 24, 2011:

http://www.iom.edu/~/media/Files/Activity%20Files/Quality/Patient%20Safety%20and%20HIT/Meeting%202/Dwork.pdf

http://www.iom.edu/~/media/Files/Activity%20Files/Quality/Patient%20Safety%20and%20HIT/Meeting%202/WoodsNormanFeb2011.pdf

http://www.iom.edu/~/media/Files/Activity%20Files/Quality/Patient%20Safety%20and%20HIT/Meeting%202/Harper%20IOM%20HIT%20Patient%20Safety.pdf

http://www.iom.edu/~/media/Files/Activity%20Files/Quality/Patient%20Safety%20and%20HIT/Meeting%202/Chrisman-.pdf

http://www.iom.edu/~/media/Files/Activity%20Files/Quality/Patient%20Safety%20and%20HIT/Meeting%202/Palmer.pdf

The PPTs can be downloaded directly from these links.

I note several observations:

  • The overall quality of these presentations appears mediocre;
  • Issues of healthcare IT risks - as they exist on the ground in 2011 - are addressed poorly if at all;
  • Proposed "solutions" are really nothing novel or new compared to existing literature or recommendations made in earlier studies, including that of the US NRC;
  • That these presentations come from the highest scientific body in the United States is, in my opinion, a disappointment and, indeed, an embarrassment.

The IOM's rules of engagement, according to the Study Director, preclude my testifying, as a Medical Informatics specialist and former CMIO, about a relative's nearly being killed by poorly designed and implemented health IT.  Instead, the committee hears presentations such as those linked above.

Here's an example of what I consider a somewhat rigorous and critical thinking-based presentation on health IT risks:

http://www.ischool.drexel.edu/faculty/ssilverstein/Clinical_IT_benefits_risks.ppt

I think the IOM should be able to do better than a mere small-university medical informatics adjunct professor.

-- SS