- It's a rare event, it's just a 'glitch', it's teething problems, it's a learning experience, we have to work the 'kinks' out, it's growing pains, etc.
Or perhaps worse:
- Patient safety was not compromised (stated long before the speaker or writer could possibly know that).
What these statements translate to: any patient harm that may have resulted is for the "greater good" in perfecting the technology.
Here is the problem with that:
These statements, while seemingly banal, are actually highly controversial and amoral, and reflect what can be called "faith-based informatics beliefs" (i.e., enthusiasm not driven by evidence).
They are amoral because they deviate significantly from accepted medical ethics and patients' rights, especially regarding experimentation and research, as expressed in the plain language of the Nuremberg Code, the Belmont Report, the World Medical Association's Declaration of Helsinki, the NIH Guidelines for the Conduct of Research Involving Human Subjects, and other documents that originated out of medical abuses of the past.
Semantic or legal arguments over the terms "research," "experimentation," etc. are, at best, misdirection away from the substantive issues. Indeed, for all practical purposes, the use of unfinished software (or software with newly minted modifications) that has not been extensively tested and validated, and that is suspected or known to cause harm, without explicit informed consent is contrary to the spirit of the aforementioned patients' rights documents.
They are excuses from health IT hyper-enthusiasts ("Ddulites") who have, in fact, become so hyper-enthusiastic as to ignore the ethical issues and downsides. This attitude grants more rights to the cybernetic device and its creators than to the patients who are subject to the device's effects.
These excuses come, in effect, from people whom it would not be unreasonable to call technophile extremists.
The Belmont Report of the mid-to-late 1970s, written long before health IT became at all common, actually begins with a section discussing "BOUNDARIES BETWEEN PRACTICE AND RESEARCH." I have updated one of the observations in that section for modern times:
... It is important to distinguish between biomedical and behavioral research, on the one hand, and the practice of accepted therapy on the other, in order to know what activities ought to undergo review for the protection of human subjects of research.
... When a clinician [or entire healthcare delivery system - ed.] departs in a significant way from standard or accepted practice, the innovation does not, in and of itself, constitute research. The fact that a procedure is "experimental," in the sense of new, untested or different, does not automatically place it in the category of research.
Radically new procedures of this description [such as use of cybernetic intermediaries to regulate and govern care - ed.] should, however, be made the object of formal research at an early stage in order to determine whether they are safe and effective. Thus, it is the responsibility of medical practice committees, for example, to insist that a major innovation [such as health IT - ed.] be incorporated into a formal research project.
Health IT appears to have been "graduated" from experimental to tried-and-true without the formal safety research called for in the Belmont Report.
The Belmont Report continues:
Research and practice may be carried on together when research is designed to evaluate the safety and efficacy of a therapy. This need not cause any confusion regarding whether or not the activity requires review; the general rule is that if there is any element of research in an activity, that activity should undergo review for the protection of human subjects.
Instead, what we have, for the most part, are excuses and special accommodations for health IT, a technology on which the literature regarding safety and efficacy is conflicting all the way up to the Institute of Medicine.