When Scot Silverstein’s 84-year-old mother, Betty, started mixing up her words, he worried she was having a stroke. So he rushed her to Abington Memorial Hospital in Pennsylvania.
After she was admitted, Silverstein, who is a doctor, looked at his mother’s electronic health records, which are designed to make medical care safer by providing more information on patients than paper files do. He saw that Sotalol, which controls rapid heartbeats, was correctly listed as one of her medications.
Days later, when her heart condition flared up, he re-examined her records and was stunned to see that the drug was no longer listed, he said. His mother later suffered clotting and a hemorrhage, and required emergency brain surgery. She died in 2011. Silverstein blames her death on problems with the hospital’s electronic medical records.
“I had the indignity of watching them put her in a body bag and put her in a hearse in my driveway,” said Silverstein, who has filed a wrongful-death lawsuit. “If paper records had been in place, unless someone had been using disappearing ink, this would not have happened.”
How can I say that? Because I trained in this hospital and worked as resident Admitting Officer in that very ED in the pre-computer era. The many personnel in 2010 who were given the medication history by my mother and me directed it not to paper, where others could see it, but to /dev/null.
Why can I say that? Because the hospital's Motion for Prior Restraint (censorship) against me was denied outright by the presiding judge just days before the Bloomberg article was published (http://en.wikipedia.org/wiki/Prior_restraint):
Prior restraint (also referred to as prior censorship or pre-publication censorship) is censorship imposed, usually by a government, on expression before the expression actually takes place. An alternative to prior restraint is to allow the expression to take place and to take appropriate action afterward, if the expression is found to violate the law, regulations, or other rules.
Prior restraint prevents the censored material from being heard or distributed at all; other measures provide sanctions only after the offending material has been communicated, such as suits for slander or libel. In some countries (e.g., United States, Argentina) prior restraint by the government is forbidden, subject to certain exceptions, by a constitution.
Prior restraint is often considered a particularly oppressive form of censorship in Anglo-American jurisprudence because it prevents the restricted material from being heard or distributed at all. Other forms of restrictions on expression (such as actions for libel or criminal libel, slander, defamation, and contempt of court) implement criminal or civil sanctions only after the offending material has been published. While such sanctions might lead to a chilling effect, legal commentators argue that at least such actions do not directly impoverish the marketplace of ideas. Prior restraint, on the other hand, takes an idea or material completely out of the marketplace. Thus it is often considered to be the most extreme form of censorship.
The First Amendment lives.
(I wonder if it irks the hospital that they cannot perform sham peer review upon me now that the censorship motion is denied. Sham peer review is a common reaction by hospital executives to "disruptive" physicians, but I have not worked there since 1987 and I no longer practice medicine.)
In the Bloomberg story Mr. Robertson wrote:
In my opinion this statement represents gross negligence by a government official. Ms. Daniel is unarguably working for a government agency pushing this technology. She makes the claim that "so far the evidence we have doesn't suggest significant risk" while surely being aware of (or having a fiduciary responsibility to be aware of) the impediments to generating such evidence.
From my March 2012 post "Doctors and EHRs: Reframing the 'Modernists v. Luddites' Canard to The Accurate 'Ardent Technophiles vs. Pragmatists' Reality" at http://hcrenewal.blogspot.com/2012/03/doctors-and-ehrs-reframing-modernists-v.html (yes, this was more than a year ago):
... The Institute of Medicine of the National Academies noted this in their late 2011 study on EHR safety:
... While some studies suggest improvements in patient safety can be made, others have found no effect. Instances of health IT–associated harm have been reported. However, little published evidence could be found quantifying the magnitude of the risk.
Several reasons health IT–related safety data are lacking include the absence of measures and a central repository (or linkages among decentralized repositories) to collect, analyze, and act on information related to safety of this technology. Another impediment to gathering safety data is contractual barriers (e.g., nondisclosure, confidentiality clauses) that can prevent users from sharing information about health IT–related adverse events. These barriers limit users’ abilities to share knowledge of risk-prone user interfaces, for instance through screenshots and descriptions of potentially unsafe processes. In addition, some vendors include language in their sales contracts and escape responsibility for errors or defects in their software (i.e., “hold harmless clauses”). The committee believes these types of contractual restrictions limit transparency, which significantly contributes to the gaps in knowledge of health IT–related patient safety risks. These barriers to generating evidence pose unacceptable risks to safety.[IOM (Institute of Medicine). 2012. Health IT and Patient Safety: Building Safer Systems for Better Care (PDF). Washington, DC: The National Academies Press, pg. S-2.]
Also in the IOM report:
… “For example, the number of patients who receive the correct medication in hospitals increases when these hospitals implement well-planned, robust computerized prescribing mechanisms and use barcoding systems. But even in these instances, the ability to generalize the results across the health care system may be limited. For other products— including electronic health records, which are being employed with more and more frequency— some studies find improvements in patient safety, while other studies find no effect.
More worrisome, some case reports suggest that poorly designed health IT can create new hazards in the already complex delivery of care. Although the magnitude of the risk associated with health IT is not known, some examples illustrate the concerns. Dosing errors, failure to detect life-threatening illnesses, and delaying treatment due to poor human–computer interactions or loss of data have led to serious injury and death.”
I also noted that the 'impediments to generating evidence' effectively rise to the level of legalized censorship, as observed by Koppel and Kreda regarding gag and hold-harmless clauses in their JAMA article "Health Care Information Technology Vendors' Hold Harmless Clause: Implications for Patients and Clinicians", JAMA 2009;301(12):1276-1278. doi: 10.1001/jama.2009.398.
FDA had similar findings about impediments to knowledge of health IT risks, see my Aug. 2010 post "Internal FDA memorandum of Feb. 23, 2010 to Jeffrey Shuren on HIT risks. Smoking gun?" at http://hcrenewal.blogspot.com/2010/08/smoking-gun-internal-fda-memorandum-of.html.
I also note this from amednews.com's coverage of the ECRI Deep Dive Study (http://hcrenewal.blogspot.
One wonders if Ms. Daniel's definition of "significant" is when body bags start to accumulate on the steps of the Capitol.
I also note she is not a clinician, but holds a JD/MPH.
I am increasingly of the opinion that non-clinicians need to be removed from positions of health IT leadership at regional and national levels.
In large part, many simply do not seem to have the experience, insight, and perhaps ethics necessary to understand the implications of their decisions.
At the very least, such people who never made it to medical school or nursing school need to be kept on a very short leash by those who did.