Showing posts with label healthcare IT reporting.

Wednesday, December 19, 2012

A Significant Additional Observation on the PA Patient Safety Authority Report "The Role of the Electronic Health Record in Patient Safety Events" -- Risk

In a Dec. 13, 2012 post, "Pennsylvania Patient Safety Authority: The Role of the Electronic Health Record in Patient Safety Events," I alluded to risk in a comment in red italics:

... Reported events were categorized by their reporter-selected harm score (see Table 1). Of the 3,099 EHR-related events, 2,763 (89%) were reported as “event, no harm” (e.g., an error did occur but there was no adverse outcome for the patient) [a risk best avoided to start with, because luck runs out eventually - ed.], and 320 (10%) were reported as “unsafe conditions,” which did not result in a harmful event. 

The focus of the report is on how the "events" did not cause harm.  Thus the relatively mild caveat:

"Although the vast majority of EHR-related reports did not document actual harm to the patient, analysts believe that further study of EHR-related near misses and close calls is warranted as a proactive measure."

It occurs to me that if the title of the paper had been "The Role of the Electronic Health Record in Patient Safety Risk," the results might have been interpreted far differently:

In essence, from June 2, 2004, through May 18, 2012 (the timeframe of the Pennsylvania Patient Safety Reporting System, or PA-PSRS, database), and from a dataset highly limited in its comprehensiveness (as described in the earlier post), there were approximately 3,000 "events" where an error occurred that potentially put patients at risk.

That view - risk - was not the focus of the study.  Should it have been?

These "events" really should be called "risk events."

It is likely that the tally of risk events would be much higher if the database were more comprehensive (due to better recognition of HIT-related problems, better reporting, etc.), as would the count of "harm and death" events.

That patient harm did not occur in the majority of these "risk events" was due to human intervention, which is to say, in large part, luck.

Luck runs out, eventually.

I have personally saved a relative several times from computer-related "risk events" that could have caused harm had I not been there personally, with my own medical knowledge, to intervene.  My presence was happenstance in several instances; in fact, a traffic jam or a phone call could have kept me from being present.

What's worse, the report notes:

Analysts noted that EHR-related reports are increasing over time, which was to be expected as adoption of EHRs is growing in the United States overall.

In other words, with the current national frenzy to implement healthcare information technology, the counts of these "risk events" - and of "harm and death" events - will increase.  My concern is that they will increase significantly.

I note that health IT is likely the only mission-critical technology that receives special accommodation regarding risk events.  "If the events didn't cause harm, then they're not that important an issue" seems to be the national attitude overall.

Imagine aircraft whose avionics and controls periodically malfunction, freeze, provide wrong results, etc., but most are caught by hyper-vigilant pilots so planes don't go careening out of control and crash.  Imagine nuclear plants where the same occurs, but due to hypervigilance the operators prevent a nuclear meltdown.

Then, imagine reports of these "risk events" - based on fragmentary reporting by pilots and nuclear plant operators reluctant to report for fear of job retaliation - where the fact of their occurrence takes a back seat to the observation that the planes did not crash, or that Three Mile Island or Chernobyl did not recur.

That, in fact, seems to be the culture of health IT.

I submit that the major focus that needs addressing in health IT is risk - not just confirmed body counts.

-- SS

Thursday, December 13, 2012

Pennsylvania Patient Safety Authority: The Role of the Electronic Health Record in Patient Safety Events

The Pennsylvania Patient Safety Authority has released a report "The Role of the Electronic Health Record in Patient Safety Events."  A press release is at this link, and the full report in PDF is at this link.  In the report, the Pennsylvania Patient Safety Authority analyzed reports of EHR-related events from a state database of reported medical errors and identified several major themes.

The report was prepared with the assistance of Erin Sparnon, Senior Patient Safety Analyst at the ECRI Institute near Philadelphia.  The ECRI Institute is an independent organization renowned for its safety testing of medical technologies and reporting on same, one that "researches the best approaches to improving the safety, quality, and cost-effectiveness of patient care."  I've mentioned it and its bylaws on this blog in the past as a model for independent, unbiased testing and reporting of healthcare technologies.

Regarding the Patient Safety Authority:


The Pennsylvania Patient Safety Authority was established under Act 13 of 2002, the Medical Care Availability and Reduction of Error ("Mcare") Act, as an independent state agency. It operates under an 11-member Board of Directors, six appointed by the Governor and four appointed by the Senate and House leadership. The eleventh member is a physician appointed by the Governor as Board Chair.  Current membership includes three physicians, three attorneys, three nurses, a pharmacist and a non-healthcare worker.

The Authority is charged with taking steps to reduce and eliminate medical errors by identifying problems and recommending solutions that promote patient safety in hospitals, ambulatory surgical facilities, birthing centers and certain abortion facilities. Under Act 13 of 2002, these facilities  must report what the Act defines as "Serious Events" and "Incidents" to the Authority.

The Authority maintains a database of serious events and incidents:

Consistent with Act 13 of 2002, the Authority developed the Pennsylvania Patient Safety Reporting System (PA-PSRS, pronounced "PAY-sirs"), a confidential web-based system that both receives and analyzes reports of what the Act calls Serious Events (actual occurrences) and Incidents (so-called "near-misses").

Cutting right to the chase, the paper's summary:

As adoption of health information technology solutions like electronic health records (EHRs) has increased across the United States, increasing attention is being paid to the safety and risk profile of these technologies. However, several groups have called out a lack of available safety data as a major challenge to assessing EHR safety, and this study was performed to inform the field about the types of EHR-related errors and problems reported to the Pennsylvania Patient Safety Authority and to serve as a basis for further study. Authority analysts queried the Pennsylvania Patient Safety Reporting System for reports related to EHR technologies and performed an exploratory analysis of 3,099 reports using a previously published classification structure specific to health information technology. The majority of EHR-related reports involved errors in human data entry, such as entry of “wrong” data or the failure to enter data, and a few reports indicated technical failures on the part of the EHR system. This may reflect the clinical mindset of frontline caregivers who report events to the Authority.

Results:

... Reported events were categorized by their reporter-selected harm score (see Table 1). Of the 3,099 EHR-related events, 2,763 (89%) were reported as “event, no harm” (e.g., an error did occur but there was no adverse outcome for the patient) [a risk best avoided to start with, because luck runs out eventually - ed.], and 320 (10%) were reported as “unsafe conditions,” which did not result in a harmful event. Fifteen reports involved temporary harm to the patient due to the following: entering wrong medication data (n = 6), administering the wrong medication (n = 3), ignoring a documented allergy (n = 2), failure to enter lab tests (n = 2), and failure to document (n = 2). Only one event report, related to a failure to properly document an allergy, involved significant harm.
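The reported breakdown can be tallied directly from the counts quoted above; a minimal sketch (using only the figures stated in the report's results) confirms the categories sum to the 3,099 total and reproduces the rounded percentages:

```python
# Sanity check on the harm-score breakdown quoted from the PA Patient Safety
# Authority report. Counts are taken directly from the report's text above.
events = {
    "event, no harm": 2763,
    "unsafe conditions": 320,
    "temporary harm": 15,
    "significant harm": 1,
}

total = sum(events.values())
assert total == 3099  # matches the reported total of EHR-related events

for label, count in events.items():
    print(f"{label}: {count} ({100 * count / total:.1f}%)")
```

Note how dominant the "no harm" and "unsafe conditions" categories are; that dominance is exactly what the risk-focused reading above turns on.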

A significant "study limitations" section was included that addressed: 

  • Issues regarding reporting statutes of the PA-PSRS errors database; 
  • lack of awareness of EHRs as a potential contributing factor to an error;
  • limitations of narrative reporting affecting both the types of reports queried and the tags applied (the study used textual data mining methodologies);
  • query design of the study; and
  • the need for further refinement of the machine learning tool used in creating the working dataset, which may have missed relevant cases.

Some of these impediments to knowing the magnitude of extant HIT issues were also noted in the 2008 Joint Commission Sentinel Event Alert on HIT, the 2010 FDA internal memorandum on HIT safety, and the 2011 IOM report on the same topic.

(The IOM report specifically observed that the "barriers to generating evidence pose unacceptable risks to safety.") 

The major obstacle to this study in my view, though, was the nature of the dataset.  The database is for general reporting of medical errors, and it contains no specific fields or reminders about EHRs or the known ways in which they can contribute to, or cause, medical mistakes.  

As acknowledged in the study, the attempt was made to glean information about EHR-related events largely through textual analysis of narratives, in the hope that the reporter recognized the role of IT and reported it using terms the search algorithms could detect.  In other words, the data was not "purposed" for this type of study.

It is axiomatic that one cannot find data that is simply not present, no matter how fancy the search algorithm.  Further, passive analysis of clinical IT risk/harms data in an industry where lack of knowledge of causation and misconceptions abound will produce only partial results that suggest further study is needed, and not give an indicator of just how incomplete the results are.
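The limitation can be illustrated with a minimal sketch of keyword-based retrieval over free-text narratives, similar in spirit (though far cruder) than the study's text mining. The term list and narratives below are invented for illustration; the point is that a report whose author never names the technology is invisible to any such query:

```python
# Hypothetical illustration: keyword retrieval over free-text event narratives.
# A narrative is flagged only if it contains a recognizable EHR-related term.
EHR_TERMS = {"ehr", "emr", "cpoe", "order entry", "electronic record", "dropdown"}

narratives = [
    "Wrong dose selected from CPOE dropdown list; caught before administration.",
    "Medication omitted; nurse unaware order existed.",   # EHR role unstated -> missed
    "EMR displayed stale allergy list after system restart.",
]

def is_ehr_related(text: str) -> bool:
    """Flag a narrative only if the reporter used a known EHR-related term."""
    lowered = text.lower()
    return any(term in lowered for term in EHR_TERMS)

flagged = [n for n in narratives if is_ehr_related(n)]
# The second narrative may well describe an EHR-related failure, but it cannot
# be found because the reporter never named the technology.
print(f"flagged {len(flagged)} of {len(narratives)} reports")  # flagged 2 of 3 reports
```

The undercount is silent: the query returns what it returns, with no indication of how many EHR-related events went unrecognized and unlabeled.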

Thus, this cautionary statement was made in the new PA Patient Safety Authority report:

"Although the vast majority of EHR-related reports did not document actual harm to the patient, analysts believe that further study of EHR-related near misses and close calls is warranted as a proactive measure." 

My comments:

The report is welcome.

The most important part of the paper, I point out, is the “Limitations” section. FDA, IOM and others have made similar observations – we don’t know the true magnitude of the problem due to systematic limitations of the available data. 

Therefore, at best, what is available must be deemed risk-management-relevant case reports: a "red flag" that could represent (in the words of FDA CDRH director Jeffrey Shuren regarding HIT safety) the tip of the iceberg.

It is imperative that far more work be done in post-market surveillance as this technology is deployed nationally and internationally, to ensure that good health IT (GHIT) prevails and that bad health IT (BHIT) is either remediated or removed from the marketplace.  I have defined these in other writings as follows:

Good Health IT ("GHIT") is defined as IT that provides a good user experience, enhances cognitive function, puts essential information as effortlessly as possible into the physician’s hands, keeps eHealth information secure, protects patient privacy and facilitates better practice of medicine and better outcomes. 

Bad Health IT ("BHIT") is defined as IT that is ill-suited to purpose, hard to use, unreliable, loses data or provides incorrect data, causes cognitive overload, slows rather than facilitates users, lacks appropriate alerts, creates the need for hypervigilance (i.e., towards avoiding IT-related mishaps) that increases stress, is lacking in security, compromises patient privacy or otherwise demonstrates suboptimal design and/or implementation. 

An additional major factor that also contributes to lack of knowledge of EHR-related adverse events is hospital reporting non-compliance. For instance, I know of cases from my own legal consulting work and personal experience that I would have expected to appear in the database, but apparently do not.

But don’t take it from me alone. Here is PA Patient Safety Authority Board Member Cliff Rieders, Esq. on this.

From "Hospitals Are Not Reporting Errors as Required by Law" (Phila. Inquirer, pg. 4, http://articles.philly.com/2008-09-12/news/24991423_1_report-medical-mistakes-new-jersey-hospital-association-medication-safety):

... Hospitals don’t report serious events if patients have been warned of the possibility of them in consent forms, said Clifford Rieders, a trial lawyer and member of the Patient Safety Authority’s board.

He said he thought one reason many hospitals don’t want to report serious events is that the law also requires that patients be informed in writing within a week of such problems. So, if a hospital doesn’t report a problem, it doesn’t have to send the patient that letter. [Thus reducing risk of litigation, and, incidentally, potentially infringing on patients' rights to legal recourse - ed.]

Rieders says the agency has allowed hospitals to determine for themselves what constitutes a serious event and the agency has failed to come up with a solid definition in six years.

Fixing this “is not a priority,” he added.

This coincides with my own personal experience precisely.  In a case where my relative was permanently injured as a result of EHR-related medication error, and then died of the injuries, I never received the required report in writing from the hospital.  I also do not believe the case was reported to the Safety Authority, at least not as IT-related.

I suspect the true rates of EHR-related close calls, reversible injuries, permanent injuries, and deaths are significantly higher than the limited data available suggests. That data is merely a red flag that much more education, more stringent reporting requirements, templates of known causes of error, and enforcement are needed.  (An April 2010 "thought experiment" on this issue that I wrote about at "If The Benefits Of Healthcare IT Can Be Guesstimated, So Can And Should The Dangers" certainly suggested as much.)

Slides where I made those types of recommendations to the Patient Safety Authority, at a presentation I gave in July 2012 at their invitation, are at http://www.ischool.drexel.edu/faculty/ssilverstein/PA_patient_safety_Jul2012.ppt

A major concern I have is that the HIT industry will use this new report in a manner that ignores its limitations.

(Disclosure: I was an invited reviewer of this new PPSA report.)

-- SS 

Addendum Dec. 13:   

Also worth reviewing is "Patient Safety Problems Associated with Healthcare Information Technology: an Analysis of Adverse Events Reported to the US Food and Drug Administration," Magrabi, Ong, Runciman, and Coiera, AMIA Annu Symp Proc. 2011.

Data here came from FDA's voluntary (i.e., also tip of the iceberg) Manufacturer and User Facility Device Experience (MAUDE) database.  Ironically, the study was done in Australia using Australian grant funds.

-- SS

Monday, November 22, 2010

EHRevent.org CEO Edward Fotsch MD: The Real Challenge with EHRs is -- User Error?

Additional detailed answers to the questions I raised here and here about a new site EHRevent.org, for reporting of healthcare IT-related medical errors, can now be found at a HIStalk interview entitled "HIStalk Interviews Edward Fotsch MD, CEO, PDR Network (EHR Event)" at this link.

It is an interesting interview. I certainly find the recognition of need for an EHR/clinical IT problems reporting service a major cultural advancement in healthcare.

It's still unclear to me how -- and why -- this organization originated with little to no public knowledge and involvement, especially considering the organization types mentioned below that participated, and how it will function in interactions with myriad healthcare IT stakeholders.

Here's an explanation by Dr. Fotsch:

... We work with a not-for-profit board called the iHealth Alliance. The Alliance is made up of medical society executives, professional liability carriers, and liaison representatives from the FDA. They govern some of the networks that we run, and in exchange for that, help us recruit physicians. Professional liability carriers, for example, promote our services that send drug alerts to doctors because that’s good and protective from a liability standpoint.

In the course of our conversations with them roughly a year ago, when we were talking about adding some drug safety information into electronic health records, we came across the fact that there were concerns from the liability carriers that there was no central place for reporting adverse EHR events or near misses or potential problems or issues with electronic health records.

[Translation: the carriers saw their losses potentially increasing as a result of litigation arising from EHR-related lawsuits, and decided to do something proactive- ed.]

They were interested in creating a single place where they could promote to their insured physicians that they could report adverse EHR events. Then it turned out that medical societies had similar concerns.

[That must have been one of the best-kept secrets on Earth considering the promotion EHR's have received as a miracle-working technology, and the lack of expression of concerns from those societies - ed.]

Rather than have each of them create a system, the Alliance took on a role of orchestrating all of the interests, including some interest from the FDA and ONC in creating an electronic health record problem reporting system. That’s how it came into play.

Our role in it, in addition to having a seat on the iHealth Alliance board, was really in network operations — in running the servers, if you will, which didn’t seem like a very complicated task. Since business partners we rely on for our core business were interested in it, it was easy to say yes. It frankly turned out to be somewhat more complicated than we originally thought [I predict they haven't seen anything yet; wait until they get knee deep into real world EHR issues - ed.], but now it’s up and available.


While I find the recognition of need for an EHR/clinical IT reporting service a major advancement, I am nonetheless troubled by certain statements made by Dr. Fotsch. They seem at odds with the theoretic and empirical findings of medical informatics, social informatics, human-computer interaction and other fields relevant for health IT evaluation, and/or seem to demonstrate biases about HIT. My comments are in red italics:

Fotsch:

… Probably what we’re seeing more often than not, the real challenge with EHRs like any technology, turns out to be some form of user error.

[What about contributory or causative designer error? – ed.]

“I didn’t know it would do that"

[Why did the user not know? Lack of training, poor manuals, or overly complex information systems lacking informative messages and consistency of control-action relationships, as an example? -ed]

... or “I didn’t know that it pre-populated that"

[Why did it pre-populate? Was that inappropriate for the clinical context, such as in this example?]

... or “I didn’t know I shouldn’t cut and paste"

[Then why did the software designers enable cut and paste, without some informative message on overuse, such as length of text cut and pasted?– ed.]

... or “I wasn’t paying attention to this"

[Perhaps due to distractions from mission hostile user interfaces? -ed]

... or maybe the user interface was a little confusing

[What is "a little confusing?" (Is that like "a little pregnant"?) And why was it confusing? User intellectual inadequacy, or software design issues leading to cognitive overload? - ed.]

Actual software errors appear to be the exception rather than the rule as it relates to EHR events.

["Actual software errors" are defined as, what, exactly--? Loss of database relational integrity as a result of a programming error, as apparently recently happened at Trinity Health, a large Catholic hospital chain as reported in HIStalk? Memory leaks from poor code? Buffer overflows? What?]

That’s at least as I understand it.

[Understand it from whom? Hopefully not from me or my extensive website on the issues - ed.]


In summary, a "blame the user" attitude seems apparent. There appears to be little acknowledgment of the concept of IT "errorgenicity" - the capacity of a badly designed or poorly implemented information system to facilitate error, and of the systemic nature of errors in complex organizations to which ill-done IT can contribute.

These are concepts that were understood long ago in mission-critical settings, as in this mid-1980s piece from the Air Force, cited in my previously linked eight-part series on mission hostile health IT:


From "GUIDELINES FOR DESIGNING USER INTERFACE SOFTWARE"
ESD-TR-86-278
August 1986
Sidney L. Smith and Jane N. Mosier
The MITRE Corporation
Prepared for Deputy Commander for Development Plans and Support Systems, Electronic Systems Division, AFSC, United States Air Force, Hanscom Air Force Base, Massachusetts.

... SIGNIFICANCE OF THE USER INTERFACE

The design of user interface software is not only expensive and time-consuming, but it is also critical for effective system performance. To be sure, users can sometimes compensate for poor design with extra effort. Probably no single user interface design flaw, in itself, will cause system failure. But there is a limit to how well users can adapt to a poorly designed interface. As one deficiency is added to another, the cumulative negative effects may eventually result in system failure, poor performance, and/or user complaints.

Outright system failure can be seen in systems that are underused, where use is optional, or are abandoned entirely. There may be retention of (or reversion to) manual data handling procedures, with little use of automated capabilities. When a system fails in this way, the result is disrupted operation, wasted time, effort and money, and failure to achieve the potential benefits of automated information handling.

In a constrained environment, such as that of many military and commercial information systems, users may have little choice but to make do with whatever interface design is provided. There the symptoms of poor user interface design may appear in degraded performance. Frequent and/or serious errors in data handling may result from confusing user interface design [in medicine, this often translates to reduced safety and reduced care quality - ed.] Tedious user procedures may slow data processing, resulting in longer queues at the checkout counter, the teller's window, the visa office, the truck dock, [the hospital floor or doctor's office - ed.] or any other workplace where the potential benefits of computer support are outweighed by an unintended increase in human effort.

In situations where degradation in system performance is not so easily measured, symptoms of poor user interface design may appear as user complaints. The system may be described as hard to learn, or clumsy, tiring and slow to use [often heard in medicine, but too often blamed on "physician resistance" - ed.] The users' view of a system is conditioned chiefly by experience with its interface. If the user interface is unsatisfactory, the users' view of the system will be negative regardless of any niceties of internal computer processing.


I am not entirely happy when the CEO of an organization taking on the responsibility of being a central focus for EHR error reporting makes statements consistent with unfamiliarity with important HIT-relevant domains, as well as with a possible pro-IT, anti-user bias.

For that reason as well as the other questions raised at my prior posts (such as the onerous legal contract and apparent lack of ability of the public to easily view the actual report texts themselves), I cannot recommend use of their site for EHR problems reporting.

I recommend the continued use of the FDA facilities until such time as a compelling argument exists to do otherwise.

-- SS

Addendum 11/28/10:

This passage ends the main essay at my site "Contemporary Issues in Medical Informatics: Common Examples of Healthcare Information Technology Difficulties" and is quite relevant here:

... An article worth reviewing is "Human error: models and management", James Reason (a fitting name!), BMJ 2000;320:768-770 (18 March), http://www.bmj.com/cgi/content/full/320/7237/768:

Summary points:

  • Two approaches to the problem of human fallibility exist: the person and the system approaches
  • The person approach focuses on the errors of individuals, blaming them for forgetfulness, inattention, or moral weakness
  • The system approach concentrates on the conditions under which individuals work and tries to build defenses to avert errors or mitigate their effects
  • High reliability organizations, which have less than their fair share of accidents, recognize that human variability is a force to harness in averting errors, but they work hard to focus that variability and are constantly preoccupied with the possibility of failure.

-- SS


Wednesday, November 17, 2010

Some answers about new site "EHRevent.org" for health IT and drug adverse event reporting - and a note on incendiaries

Some answers to the questions I raised here and here about a new site EHRevent.org, for reporting of healthcare IT and drug problems, can be found in a blog post at the site of Occam Practice Management at this link: http://www.occampm.com/blog/general/ehr-event-reporting/.

Its author, Michelle R. Wood, had noted this HC Renewal post. She researched some of the questions and wrote up her findings.

It is well worth a read.

I do have a small bone to pick with her post at Occam. She wrote:

"While HC Renewal occasionally borders on the incendiary side of things, Dr Silverstein posed some valid questions about a website that seem to have caught everyone by surprise..."

I maintain that the true incendiaries are fired by those we write about, those whose pronouncements and acts are "threats to health care's core values, especially those stemming from concentration and abuse of power."

Those 'incendiary' pronouncements and acts can indeed maim and kill (for example, as a relative of mine is now experiencing thanks to a commercial EMR 'mishap').

It may be more accurate to say we don't restrict ourselves to the confines of 'political correctness,' that is, stunted discourse conventions that generally favor maintenance of the status quo.

As I wrote on that issue last year here in my series on mission hostile healthcare IT:

... Some have complained I am being "politically incorrect." At a time when our banks, major industries, investments, lifestyle and retirements have been seriously eroded by a combination of secrecy, incompetence, and criminal behavior on an unprecedented scale, I think such people need to get their priorities in order.

In his mantra "Critical thinking always, or your patient's dead", cardiothoracic surgeon Victor P. Satinsky, mentioned in earlier posts as my earliest medical mentor, did not include "but be polite about it" as part of the lesson.


On those pesky EMR curmudgeons ... (image; click to enlarge)

-- SS

Addendum 11/17/10:

At the above Occam link Ms. Wood published my brief comment on this issue, and a thoughtful response. See the comment thread of her EHRevent essay.

-- SS