
Saturday, January 05, 2013

ONC and "Health IT Patient Safety Action & Surveillance Plan": When Sociologists Uphold the Hippocratic Oath While Physicians Pay Respect to the Lords of Kobol, We Are in a Dark Place, Ethically

[Note: this essay contains many hyperlinks. They can be right-clicked and opened in a separate tab or window.]

I've been meaning to write more on the just-before-Christmas, Friday afternoon, minimal-visibility release of the ONC report I'd written about in my Dec. 23, 2012 post "ONC's Christmas Confessional on Health IT Safety: HIT Patient Safety Action & Surveillance Plan for Public Comment."   (The ONC report itself is available at this link in PDF.)

The Boston Globe and Globe staff writer Chelsea Conaboy, however, have beaten me to the punch in the Jan. 3, 2013 article "Federal government releases patient safety plan for electronic health records", link below.

('Lords of Kobol', of course, is a pun.  They were fictional gods in a sci-fi series from the 1970s and its remake a few years ago, but in my circles the term is used satirically and derisively to describe people expressing inappropriate overconfidence in - and perhaps worship of - computers.  Cobol, the COmmon Business-Oriented Language, is one of the oldest programming languages and was the major programming language of the merchant computing sector, including business, finance, and administrative systems for companies and governments.)

First, I do want to reiterate what I'd mentioned in my earlier post:  the new ONC report is a sign of progress, in terms of a government body explicitly recognizing the social responsibilities incurred by conducting the mass human subjects experiment of national health IT.  However, I also wrote:

... [The ONC report] is still a bit weak in acknowledging the likely magnitude of under-reporting of medical errors, including HIT-related, in the available data, and the issue of risk vs. 'confirmed body counts' as I wrote at my recent post "A Significant Additional Observation on the PA Patient Safety Authority Report -- Risk".

The Globe quoted a number of people involved in the health IT debate, and I am now commenting on their Jan. 3 article:

Federal government releases patient safety plan for electronic health records
Boston Globe
01/03/2013 11:16 AM   

By Chelsea Conaboy, Globe Staff

The federal office in charge of a massive rollout of electronic health records has issued a plan aimed at making those systems safer by encouraging providers to report problems to patient safety organizations.

Though some in the field say it doesn’t go far enough, others said the plan is an important step for an office whose primary role has been cheerleader for a technology that has the potential to dramatically improve health care in the United States but that may come with significant risks.

A major issue at the heart of the controversy is the fact that, admittedly, nobody knows the magnitude of the risks - in large part due to systematic impediments to knowing.  This has been admitted by organizations including the Joint Commission (link), the U.S. FDA (link; albeit in an "internal memo" never intended for public view, and discovered only through the hard work of Center for Public Integrity investigative reporter Fred Schulte when he was at the Huffington Post Investigative Fund), the Institute of Medicine of the U.S. National Academies (link, quoted in the midsection of that post), and others.

I have made the claim that when you don't know the level of harm of an intervention in healthcare, and there are risk management-relevant case reports of dangers, you don't go gung-ho into a national-scale implementation with penalties for non-adopters and only then decide to study safety, quality, usability, etc.  You determine safety first, in more controllable and constrained environments.  Anything else is, as I wrote, putting the cart before the horse (link).


[Image caption: Things are a bit out of order here.]


You also certainly don't dismiss risk management-relevant case reports from credible observers as "anecdotal" - the common refrain of hyperenthusiasts and (incompetent) scientists who conflate scientific research with risk management - as a senior clinician from Down Under eloquently observed in the Aug. 2011 guest post "From a Senior Clinician Down Under: Anecdotes and Medicine, We are Actually Talking About Two Different Things."

Back to the Globe:

A year ago, the Institute of Medicine issued a report urging the federal government to do more to ensure the safety of electronic health records. It highlighted instances in which the systems were linked to patient injury, deaths, or other unsafe conditions.

The report suggested creating an independent body to investigate problems with electronic records and to recommend fixes, similar to how the National Transportation Safety Board investigates aviation accidents.

Instead, the Office of the National Coordinator for Health Information Technology delegated various monitoring and data collection duties to existing federal offices, including the Agency for Healthcare Research and Quality [AHRQ].

The problem is that AHRQ is a research agency (as its name suggests): it has no regulatory authority, no experience in regulation, and most clinicians have never heard of it.  In effect, this ONC recommendation lacks teeth, even compared to the relatively milquetoast recommendations of the IOM itself (as I wrote in a Nov. 2011 post "IOM Report - 'Health IT and Patient Safety: Building Safer Systems for Better Care' - Nix the FDA; Create a New Toothless Agency").


The [ONC] office has asked patient safety organizations, which work with doctors and hospitals to monitor and analyze medical errors, to add health IT to their agendas. Data from the organizations would be aggregated by the agency, but reporting by doctors and hospitals is completely voluntary.  [A prime example of what I term an extraordinary regulatory accommodation afforded the health IT industry - ed.]

Now we're into septic shock blood pressure-level weakness.  Here is PA Patient Safety Authority Board Member Cliff Rieders, Esq. on the failure of even mandatory (let alone voluntary) reporting, from "Hospitals Are Not Reporting Errors as Required by Law", Philadelphia Inquirer, pg. 4, http://articles.philly.com/2008-09-12/news/24991423_1_report-medical-mistakes-new-jersey-hospital-association-medication-safety:

... Hospitals don’t report serious events if patients have been warned of the possibility of them in consent forms, said Clifford Rieders, a trial lawyer and member of the Patient Safety Authority’s board.

He said he thought one reason many hospitals don’t want to report serious events is that the law also requires that patients be informed in writing within a week of such problems. So, if a hospital doesn’t report a problem, it doesn’t have to send the patient that letter. [Thus reducing risk of litigation, and, incidentally, potentially infringing on patients' rights to legal recourse - ed.]


Rieders says the agency has allowed hospitals to determine for themselves what constitutes a serious event and the agency has failed to come up with a solid definition in six years.

Fixing this “is not a priority,” he added.

To expect hospitals to voluntarily report even a meaningful fraction of mistakes and near-misses out of pure altruism, or to permit their clinicians to do so, given the inherent risks to organizational interests such reporting entails, is risible.

The near-absence of reporting by most health IT sellers and hospitals in the already-existing FDA Manufacturer and User Facility Device Experience (MAUDE) database is substantial confirmation of that; the reports that do appear in MAUDE, however, are hair-raising.  See my Jan. 2011 post "MAUDE and HIT Risks: What in God's Name is Going on Here?" for more on that issue.

Here's an example of what happens to 'whistleblowers', even those responsible for system development and safety: "A Lawsuit Over Healthcare IT Whistleblowing and Wrongful Discharge."

ONC's recommendations thus, in my opinion, reflect bureaucratic window dressing, designed to create progress - but progress that can probably be measured in microns.

“There was no evidence that a mandatory program was necessary,” Jodi Daniel, the [ONC] office’s director of policy and planning, said in an interview.

Really?  See the aforementioned Philadelphia Inquirer article "Hospitals Are Not Reporting Errors as Required by Law", as well as the numerous articles on pharma and medical device industry reporting deficits cited starting at page 5 of my paper "A Medical Informatics Grand Challenge: the EMR and Post-Marketing Drug Surveillance" at this link in PDF.

There is no evidence mandatory reporting is necessary ... to someone who's either naïve, incompetent - or persuaded, e.g. with money, to not find evidence or rationale.

The [ONC] office has been under pressure to roll out the electronic health records systems quickly while protecting patient data and making sure that the systems don’t cause problems in medical care, said Dr. John Halamka, chief information officer at Beth Israel Deaconess Medical Center. 

Under pressure from the health IT lobby, perhaps, but from nobody else that I can think of.

“It’s this challenging chicken-and-egg problem,” he said.

No, actually, it isn't.  Patient safety must come first.  This becomes clear when one considers the ethical principle Primum non nocere ("first, do no harm" or "abstain from doing harm"), rooted in the Hippocratic tradition of the late 5th century BC, versus the late 20th and early 21st century IT-hyperenthusiast credo I've expressed as "Cybernetik Über Alles" ("Computers above all").  Under CÜA, the computer has more rights than the patient, and the IT industry receives extraordinary regulatory accommodation of sloppy practices that no other healthcare sector - or mission-critical non-healthcare sector - enjoys.

I sent Dr. Halamka a set of arguments like those I make here, along with a picture of a health IT 'chicken': my deceased mother in her death robes.

I received back a "thank you for the views" message - but no condolences.  (It occurs to me that I have rarely if ever received condolences from any senior HIT-hyperenthusiast Medical Informatics academic or government official to whom I've mentioned my mother.  Not to play amateur psychologist, but I believe it reflects the level of disdain or even hatred these people feel towards health IT iconoclasts/patients' rights advocates.)

The plan, which is subject to public comment through Feb. 4, “is a reasonable start,” in part because it puts more pressure on hospitals and doctors to monitor safety, Halamka said.

As I expressed to Dr. Halamka, we are in agreement on that point.

The government would have risked stifling innovation in the industry if it had opted instead to require the kinds of tests and review by the Food and Drug Administration that new medical devices and drugs must go through, he said.

To that, I mention here (as I did in my email to him) my response to this industry meme, as I had expressed it at Q&A after my August 2012 keynote address to the Health Informatics Society of Australia:

... I had a question from the audience [after my talk], from fellow blogger Matthew Holt of the Health Care Blog.  (I've had some online debate with him before, such as in the comment thread at my April 2012 post here.)

Matthew asked me a somewhat hostile question (perhaps in retaliation for the thrashing he received at the end of my May 2009 post on the WaPo's HIT Lobby article here) that I was well prepared for; I had, in fact, expected a question along these lines from the seller community.  The question was preceded by a bit of a soliloquy of the "You're trying to stop innovation through regulation" type, with a tad of Merck/VIOXX ad hominem thrown in (I ran Merck Research Labs' Biomedical libraries and IT group in 2000-2003).

His question was along these lines: you were at Merck; VIOXX was bad; health IT allowed Kaiser to discover the VIOXX problem several years before anyone else; you're trying to halt IT innovation by demanding regulation of the technology, thus harming such capabilities and other innovations.

The audience was visibly unsettled.  Someone even hollered out their disapproval of the question.

My response was along the lines that:

  • VIOXX was certainly not Merck at its best, but regulation didn't stop Merck from "revolutionizing" asthma and osteoporosis via Singulair and Fosamax;
  • That I'm certainly not against innovation; I'm highly pro-innovation;
  • That our definitions of "innovation" in medicine might differ, in that innovation without adherence to medical ethics is not really innovation.  It is exploitation.

I stand by that assessment.

More from the Globe article:

There is little good research into how the systems improve health care and there are big obstacles to fixing even the known problems, said Ross Koppel, a professor of sociology at the University of Pennsylvania who studies hospital culture and medication errors.

Some developers require providers to sign nondisclosure agreements before using their systems, and the safety plan does not prohibit such gag clauses.  [Note: I wrote on this issue here, and in a published July 2009 JAMA letter to the editor "Health Care Information Technology, Hospital Responsibilities, and Joint Commission Standards" here - ed.]  While the plan addresses reporting of known problems, Koppel said it will not help researchers and developers understand problems that go unnoticed but that may be causing real patient harm. 

“We only know the tip of the iceberg” about how electronic health records affect patient care, said Koppel, who was an official reviewer for the Institute of Medicine report.

As per the title of this blog post, we are in a dark place, ethically, when a PhD sociologist who (to my knowledge) has never taken the Oath of Hippocrates appears to express more concern for patient safety and patients' rights than a Harvard physician-informatics Key Opinion Leader such as Dr. Halamka.

Koppel said the mantra of the Office of the National Coordinator has been that more health IT leads to better health care. “It probably is better than paper,” he said, “but it could be so much better than it is.”

I agree, but with caveats.  I opine that bad health IT is likely worse for patients than a good, well-staffed paper-based system.  For instance, the former can cause systematic dangers that even a bad paper system cannot, such as tens of thousands of prescription errors (see my Nov. 2011 post "Lifespan Rhode Island: Yet another health IT 'glitch' affecting thousands - that, of course, caused no patient harm that they know of - yet") or mass privacy breaches (see the current 30 or so posts on that issue at this blog query link: http://hcrenewal.blogspot.com/search/label/medical%20record%20privacy).

On good health IT and bad health IT from my teaching site "Contemporary Issues in Medical Informatics: Good Health IT, Bad Health IT, and Common Examples of Healthcare IT Difficulties" at http://www.ischool.drexel.edu/faculty/ssilverstein/cases/:

Good Health IT ("GHIT") is IT that provides a good user experience, enhances cognitive function, puts essential information as effortlessly as possible into the physician’s hands, keeps eHealth information secure, protects patient privacy and facilitates better practice of medicine and better outcomes. 

Bad Health IT ("BHIT") is IT that is ill-suited to purpose, hard to use, unreliable, loses data or provides incorrect data, causes cognitive overload, slows rather than facilitates users, lacks appropriate alerts, creates the need for hypervigilance (i.e., towards avoiding IT-related mishaps) that increases stress, is lacking in security, compromises patient privacy or otherwise demonstrates suboptimal design and/or implementation.
 

The Boston Globe article concludes:

Ashish Jha, associate professor of health policy at Harvard School of Public Health and a member of the panel that drafted the Institute of Medicine report, said he wants doctors to be able to report problems -- errors in medication lists, for example -- in real-time so they can be found and fixed quickly. The safety plan does not require systems to have that capability, but Daniel said her office could soon add such a requirement for products that receive federal certification.

The bigger problem is that health care as a whole needs a better way of tracking patient safety, Jha said. Monitoring issues caused by electronic health records “should be a part of it, and then we can actually know if this is a small, medium or large contributor to patient safety issues,” he said. “But we don’t know that.”

I agree with Dr. Jha, but the IT sellers and healthcare organizations will (legitimately) claim that adding real-time error reporting/forwarding to their products will be extremely resource-intensive.
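For concreteness, here is a minimal sketch of the kind of real-time "report a problem" hook Dr. Jha describes - a toy illustration under my own assumptions, not any vendor's actual API, and not a claim that production-grade integration is trivial.  The field names and the patient safety organization endpoint (pso_endpoint) are hypothetical; a real implementation would also need authentication, de-identification, queuing, and audit-trail integration:

    # Hypothetical sketch (mine, not any vendor's API): a minimal "report a
    # problem" hook an EHR could expose, forwarding a clinician-observed issue
    # (e.g., an erroneous medication list) to a patient safety organization.
    import json
    import datetime
    import urllib.request

    def report_problem(screen_id: str, description: str, severity: str,
                       pso_endpoint: str = "https://pso.example.org/hit-reports"):
        """Package a clinician's observation and forward it immediately."""
        report = {
            "timestamp": datetime.datetime.utcnow().isoformat() + "Z",
            "screen_id": screen_id,      # which EHR screen exhibited the problem
            "description": description,  # free text, stripped of patient identifiers
            "severity": severity,        # e.g., "near-miss", "harm", "annoyance"
        }
        req = urllib.request.Request(
            pso_endpoint,
            data=json.dumps(report).encode("utf-8"),
            headers={"Content-Type": "application/json"},
        )
        # A real system would queue, authenticate, and retry; this sketch
        # simply POSTs the report and returns the HTTP status code.
        with urllib.request.urlopen(req) as resp:
            return resp.status

The point of the sketch is scale: the reporting payload itself is trivial; the real costs the sellers will cite lie in workflow integration, liability review, and data governance - hence my alternate approach below.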

I have an alternate approach that will require little effort on the part of the sellers and user organizations.

  • Post a message at the sign-in screen of all health IT, along the lines of: "This technology is experimental, adopted willingly by [organization] although not rigorously vetted for safety, reliability, usability, or fitness for purpose; thus you use it at your own risk.  If problems occur, report them to the following" ...

"The following" could include a list of alternatives such as I wrote in my Aug. 2012 post "Clinicians: How to Document the EHR Screens You Encounter That Cause Concern."


... When a physician or other clinician observes health IT problems, defects, malfunctions, mission hostility (e.g., poor user interfaces), significant downtimes, lost data, erroneous data, misidentified data, and so forth ... and most certainly, patient 'close calls' or actual injuries ... they should (anonymously if necessary if in a hostile management setting):

  • Inform their facility's senior management, if deemed safe and not likely to result in retaliation such as being slandered as a "disruptive physician" and/or being subjected to sham peer review (link).
  • Inform their personal and organizational insurance carriers, in writing. Insurance carriers do not enjoy paying out for preventable IT-related medical mistakes. They have begun to become aware of HIT risks. See, for example, the essay on Norcal Mutual Insurance Company's newsletter on HIT risks at this link. (Note - many medical malpractice insurance policies can be interpreted as requiring this reporting, observed occasional guest blogger Dr. Scott Monteith in a comment to me about this post.)
  • Inform the State Medical Society and local Medical Society of your locale.
  • Inform the appropriate Board of Health for your locale.
  • If applicable (and it often is), inform the Medicare Quality Improvement Organization (QIO) of your state or region. Example: in Pennsylvania, the QIO is "Quality Insights of PA."
  • Inform a personal attorney.
  • Inform local, state and national representatives such as congressional representatives. Sen. Grassley of Iowa is aware of these issues, for example.


See the actual post for an idea about clinicians seeking indemnification when forced by healthcare organizations to use bad health IT.  I can attest to actually seeing HIT policies that call for "human resources actions" if clinicians refuse to use HIT, or cannot learn to use it at a sufficient pace.

(Left out of this reiteration is the demonstration of photographing problematic EHR screens.  See the post for the details - it is easy to do, even with a commodity cellphone.)

HHS should be promoting laws protecting clinicians from retaliation when they report problems in good faith.

Thus, physicians, nurses and other clinicians can create needed health IT transparency and help our society discover the true level of risks of bad health IT.  They simply need the right information on what to do and where to report, bypassing the ONC office and, in the spirit of medicine, taking such matters into their own hands in the interests of patient care and medical ethics.

I also made recommendations to the Pennsylvania Patient Safety Authority on how known taxonomies of health IT-related medical error can be used, and need to be used, to promote error reporting in common formats.  Slides from my presentation to the Authority entitled "Asking the Right Questions:  Using Known HIT Safety Issues to Improve Risk Reporting and Analysis", given in July 2012 at their invitation, are at http://www.ischool.drexel.edu/faculty/ssilverstein/PA_patient_safety_Jul2012.ppt
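To illustrate what reporting in "common formats" means in practice, here is a minimal sketch of a taxonomy-coded event report.  The category values are placeholders of my own devising for illustration - not the actual taxonomy from that presentation, nor the AHRQ Common Formats:

    # Hypothetical sketch (my own illustration): a taxonomy-coded HIT event
    # report serialized to a shared JSON shape, so reports from different
    # facilities can be pooled and compared. Category values are invented.
    from dataclasses import dataclass, asdict
    from enum import Enum
    import json

    class HITErrorCategory(Enum):
        WRONG_PATIENT_DATA = "data associated with wrong patient"
        DATA_LOSS = "data lost or not stored"
        USABILITY = "confusing or mission-hostile user interface"
        DOWNTIME = "system unavailable when needed"
        ALERT_FAILURE = "missing or erroneous clinical alert"

    @dataclass
    class HITEventReport:
        category: HITErrorCategory  # taxonomy code; drives cross-facility aggregation
        harm_occurred: bool         # actual injury vs. near-miss/risk
        narrative: str              # free text, stripped of identifiers
        system_component: str       # e.g., "CPOE", "pharmacy module"

        def to_common_format(self) -> str:
            """Serialize to one shared JSON shape for pooled analysis."""
            record = asdict(self)
            record["category"] = self.category.value
            return json.dumps(record)

    # Example: a near-miss in which a drug-drug interaction alert failed to fire.
    print(HITEventReport(HITErrorCategory.ALERT_FAILURE, False,
                         "Interaction alert did not fire for warfarin + NSAID",
                         "CPOE").to_common_format())

The value of a fixed taxonomy is that the same kind of error can be counted as such across institutions, instead of disappearing into incomparable free text.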

Finally, another sign of progress:  unlike the HITECH Act, this new ONC plan is open to public comment.

-- SS

Addendum Jan. 8, 2013:

Dr. Halamka has posted more details on his views at his blog, in an entry entitled "Electronic Health Record Safety": http://geekdoctor.blogspot.com/2013/01/electronic-health-record-safety.html

He writes:

... Some have questioned the wisdom of moving forward with EHRs before we are confident that they are 100% safe and secure.   [That, of course, is not my argument - nothing is ever 100% safe and secure.  However, we don't yet know just how safe and secure - or unsafe and insecure - HIT is.  That is the issue I am concerned about - ed.] I believe we need to continue our current implementation efforts.

I realize it is a controversial statement for me to make, but let me use an analogy.

When cars were first invented, seat belts, air bags, and anti-lock brakes did not exist.  Manufacturers tried to create very functional cars, learned from experience how to make them better, then innovated to create new safety technologies, many of which are now required by regulation.

Writing regulation to require seat belts depended on experience with early cars.

My grandmother was killed by a medication error caused by lack of an EHR.  My mother was incapacitated by medication issues resulting from lack of health information exchange between professionals and hospitals.   My wife experienced disconnected cancer care because of the lack of incentives to share information.     Meaningful Use Stage 2 requires the functionality in EHRs which could have prevented all three events.

I express my condolences on those events.

I disagree, however, with continuing national implementation efforts at the current rate, with penalties for non-adopters.  I write from the perspective that health IT has not reached a stage where it is ready for national rollout: it remains experimental, its magnitude of harm admittedly unknown and its safety information flows systematically impaired.  I recommend great caution under those circumstances, and remediation of those circumstances before full-bore national implementation.

I will leave it to the reader to ponder the two views.

-- SS

9 comments:

  1. It is scary that these instruments have been deployed in such a vast vacuum of knowledge on whether or not they do more harm than good.

    The medical literature has many examples of procedures, management strategies, and devices that were thought to be the cure-all but, upon appropriate study, were found to cause more harm than good.

    Such is likely the case with most currently available HIT versions.

    Problem: neither vendors, Congress, HHS, ONC, nor innumerable financially conflicted would-be researchers want to learn the truth.

  2. The quote attributed to Jodi Daniel in the Boston Globe article, “There was no evidence that a mandatory program was necessary", is a gem. It's all the more outstanding when set against the news that Penguin Books is re-issuing George Orwell's books with new covers.

  3. Trevor3130 said...

    The quote attributed to Jodi Daniel in the Boston Globe article, “There was no evidence that a mandatory program was necessary", is a gem.

    Quotes like these from Daniel and Halamka make attention-getting slides in my presentations to plaintiffs' attorneys on HIT industry negligence.

    -- SS

  4. Scot,

    Thank you for the extensive treatment of the Boston Globe article. I also attempted to post a comment there, but that function appears to be disabled, so I will post the comment here instead.

    To whom it may concern,

    Earlier in the formation of Federal HIT policy, there was an untested presumption that HIT would reduce health care waste, fraud, and abuse. ONC, with Ms. Daniel closely involved, chose to study that question, and the resulting study, reported in 2005, deemed it likely that HIT would initially actually increase fraud. ONC then commissioned a follow-on study, published in 2007, providing recommendations on how to reduce that harm. It was through that study that I met Ms. Daniel, and since then we have periodically touched base on when those recommendations might be taken up meaningfully. ONC statements to the contrary, none of the recommendations were included in ONC actions in the 5 years thereafter. Foundational requirements may appear in later stages of Meaningful Use, perhaps influenced by the growing evidence that HIT is indeed increasing non-productive costs, as reported by the Center for Public Integrity, the New York Times, the Wall Street Journal, and other news channels.

    So, even when ONC asked the question about potential HIT-mediated harms to the Program Integrity of, and fraudulent claims against, Federal programs like Medicare and Medicaid, and even when it received recommendations for action, ONC took 5-plus years to act. Knowledge of HIT-mediated harms has not, to date, influenced HIT policy.

    This also points out that if there is no ONC study of patient harms from HIT, it is not because ONC lacked the capability to conduct one. It is in part because apparently nobody at ONC chose or thought to ask questions about HIT-mediated patient harms, although the Institute of Medicine had previously (1999) reported on the appalling number of preventable deaths due to medical errors in its portrayal of an already dangerous industry. Apparently patient safety was not considered as materially important as Program Integrity and Waste, Fraud, and Abuse, and in that context ONC can reasonably choose to say "there was no evidence." Of course, this also requires ignoring a substantial body of peer-reviewed literature demonstrating such harms, for example the works of your article's cited Dr. Ross Koppel.

    (continued)

  5. (continued from previous comment)


    Nonetheless, again, even when ONC did ask a question about worsening fraud as an unintended consequence, and even when told there would be harms (2005) and how to initially address them (2007), no action was taken for many years.

    As a glib and gratuitous side note, might this mean that, to be consistent, we should deregulate pharmaceuticals, food, toxic chemicals, air quality, nuclear waste, and consumer safety in order to stop suppressing innovation? On the other hand, isn't the effect of regulation also to re-direct innovation toward finding better ways of improving safety (think air travel, think cars, think food), by giving the industry a level "floor" of safety requirements that innovators must compete to meet in the most efficient and effective ways possible? (Wherein the Feds stand in to represent the otherwise invisible harms and deaths of tens of thousands of patients, to make sure that those widely distributed harms cannot be ignored?)

    While we look forward to that discussion, we also await a more sound explanation of why, as a nation, we are conducting this massive experiment in information technology on 315 million human subjects. "Stifling innovation" as a rationale is clearly disproven by regulation's effect on even less risk-laden industries. Health care, on the other hand, as previously noted, remains highly risk-laden: according to that 1999 IOM report, health care errors are killing 44,000 to 98,000 people a year.

    Here is another possible hypothesis: the major old-line vendors of HIT have to redesign their legacy systems to incorporate patient safety (and usability, and records-management competence, etc.). Old-line users of HIT, like major medical centers such as Dr. Halamka's, have to similarly upgrade. This will cost a lot of money, and the longer the transitions are delayed - and the more bad HIT is installed by others - the less injury these players risk to their businesses from superior HIT-enabled competitors, and the less medico-legally "abnormal" these decisions appear.

    Another problem is that someone will also have to deliver the bad news to Congress that its previous direction to ONC, to get HIT "out there" ASAP, is having consequences that were not mentioned as risks in previous briefs to Congress. It is always easier for a complex "group think" dynamic to persist than to re-assess and re-direct, especially when the current wrong direction has so many beneficiaries with dollars to amplify their voices, while the harms fall on those without such amplification.

    (continued)

  6. (continued from previous)


    In any case, nobody will disagree with this statement: there will eventually be regulations that stipulate patient safety, fitness as records management systems, and competence in usability at the bedside. When this occurs, these will favor those health care organizations and those HIT vendors who can compete on the basis of actual patient benefit and organizational improvement. The policy question at hand today is how long ONC (or FDA, or the market, or hospital/clinic attorneys, personal injury attorneys, etc.) will delay this event that all agree will occur: requiring fitness in HIT, requiring compliance with records practices decades old and with Standards 5+ years old reflecting those practices?

    In the meantime, how many billions in subsidies will taxpayers pay, without discretion, for both bad and good HIT? Apparently we are leaving it to the marketplace to determine when patient safety becomes a "nice to have" competitive advantage.
    To return to the initial observation: it is known that ONC has a study on waste, fraud, and abuse that it has yet to use aggressively, one that has been gathering dust since 2005. Hopefully ONC (or, more properly, its Congressional and Executive direction) will not wait for HIT to produce enough injury and death to worsen the IOM statistics for the already-known dangers of US healthcare.

    Also, hopefully more physicians, nurses, and other clinical professionals, as well as those seeing the harms to organizations (coders, information managers), will communicate with ONC and through their professional organizations to stop HIT human experimentation, while acting aggressively in their own organizations to mitigate harms and speed improvements.

    HIT Standards exist, so Federal regulators can expedite requiring compliance with EHR functional requirements already in place. For the vast majority of clinicians and clinical organizations, I embrace that they believe that they are acting in good faith. Hopefully the growing body of calls for transparency and evidence-based HIT will increasingly alert clinical organizations and clinicians to the massive risks others are insisting they take - while telling them to just be quiet and follow orders.

    The ONC itself is directed by Congress to promulgate HIT with zero mandate for patient safety, which it presumes to be in the able hands of doctors, nurses, hospitals, and clinics - and which is thereafter worked out on the bodies of patients.

    For the many struggling to redirect Federal HIT policy back to its proper end - protecting citizens/patients - keep up the good work. To those at the bedside combating Federally subsidized harms: we patients all thank you.

    Sincerely,

    Reed D. Gelzer, MD, MPH
    Advocates for Documentation Integrity and Compliance

  7. Reed, thanks for your comprehensive, well-thought-out comments.

    I have but two comments:

    On the other hand, isn't the effect of regulation also to re-direct innovation toward finding better ways of improving safety (think air travel, think cars, think food), by giving the industry a level "floor" of safety requirements that innovators must compete to meet in the most efficient and effective ways possible?

    Agreed. Reckless experimentation is not innovation. Regulation helps keep innovations at a "floor level" of safety, as you state. Regulation focuses the mind.

    I've worked in regulated sectors. We knew our "innovations" had to pass review, just as someone seeking NIH money for experimentation knows they have to make a good safety case to get funded.

    Put more plainly, the blanket statement that "regulation impairs innovation" in a field like medicine is that of people who need a refresher course in ethics.

    For the vast majority of clinicians and clinical organizations, I embrace that they believe that they are acting in good faith.

    If they are acting in ignorance, then they may believe they are acting in good faith, but in reality they are acting recklessly.

    -- SS

  8. In reading the comments I could not help but reflect on my own experience with IT projects. Put simply, it is all about process, not product. Sadly, many in IT will even sabotage their own projects to extend the timeline and thus their income and profits.

    This was driven home a number of years ago when my attorney wife was excluded from a large statewide IT project. The Feds were less than polite in asking that she be assigned to the project, only to have their request denied.

    The problem was that my wife was interested in a functioning, finished product. The IT people wanted to extend the project for as long as possible, and saw the Federal funding - our tax dollars - as a never-ending stream of money for their use.

    One example highlights the issue: I asked why they were not using a GUI drop-down menu, common at the time, in their system, instead of a code-driven design in which every page has a different command and no two pages are interconnected. The answer from the 20-something State-employed project manager was that he expected to retire doing the upgrades and was looking to expand his staff.

    As long as those inside government see no reason to produce a finished safe product, we on the outside will be subject to a continuation of the same unsafe environment we face today.

    Steve Lucas

  9. Steve Lucas said...

    could not help but reflect on my own experience with IT projects. Put simply, it is all about process, not product.

    I had written on my original webpages, in 1998 or so, that hospital IT departments appeared more concerned with 'process' than with results - even if the focus on 'process' harmed or killed people.

    To make projects move forward, I had to remove the assigned IT persons and hire my own, keeping them independent of MIS - for example, in invasive cardiology.

    The IT people wanted to extend the project for as long as possible, and saw the Federal funding - our tax dollars - as a never-ending stream of money for their use.

    In business or mercantile computing that's one thing.

    Knowingly degrading clinical IT projects to assure continued funding/career advancement, while putting patients at risk, is a grossly negligent if not criminal act, I believe.

    -- SS


