In my post "Will the U.S. spend the Economic Recovery Act's $20 billion for Healthcare IT more wisely than the UK?", I expressed concern that the answer to the titular question might in fact be "no."
In this third post of the series, I present more information that stands as another exhibit supporting that concern. The first case of the series, at this link, was my own; the second was that of another person, an HIT consultant and former student in healthcare informatics.
This third case presents the perspective of a technical IT support person working for a vendor. He is young, yet his observational and reasoning skills exceed those of many "senior" people I have encountered.
This case came unsolicited. It is worth reading in its entirety, for it speaks quite a number of truths. I have edited some of it out for length; the full version appears here.
Congress, are you listening?
Case Three (emphases mine):
Hi MedInformaticsMD, I just finished reading your website on informatics. I thought it covered a lot of the issues in clinical IT well, very well. One thing I thought was missing, though, was the perspective of the vendor IT staff and how these issues trickle upstream (or is it downstream?) to clinical IT.
... A brief background ... I am relatively young, especially in the healthcare world. I have been working in healthcare IT for almost 3 years now and started with no knowledge of healthcare; my background was/is technical.
I have worked with several hospitals and groups of hospitals to create interfaces to practices (over 50 interfaces and over 15 different EMR/HIS/LIS systems), and in some cases to other labs, to share HL7-formatted clinical data. I have also been dabbling in (E)MPIs and working on the design of a light-weight EMR application.
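[For readers unfamiliar with the format: below is a minimal, hypothetical sketch of what "HL7-formatted clinical data" looks like on the wire and how an interface might pull patient fields out of it. The sample message, names, and simplified parser are invented for illustration, not this writer's actual system; real feeds vary by sender and version, and the MSH segment's separator fields get special handling that is glossed over here. - ed.]

# Hypothetical HL7 v2 ADT (admit/discharge/transfer) message; real messages
# separate segments with carriage returns, joined here for readability.
SAMPLE_ADT = "\r".join([
    "MSH|^~\\&|SEND_APP|SEND_FAC|RECV_APP|RECV_FAC|200901011230||ADT^A01|MSG00001|P|2.3",
    "PID|1||123456^^^HOSP^MR||DOE^JOHN||19600101|M",
    "PV1|1|I|ICU^01^A",
])

def parse_segments(message):
    """Split an HL7 v2 message into {segment id: list of fields} (simplified)."""
    segments = {}
    for raw in message.split("\r"):
        fields = raw.split("|")
        segments[fields[0]] = fields
    return segments

segs = parse_segments(SAMPLE_ADT)
pid = segs["PID"]
print("MRN: ", pid[3].split("^")[0])      # PID-3, identifier list -> 123456
print("Name:", pid[5].replace("^", " "))  # PID-5, patient name -> DOE JOHN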
Although I have limited healthcare experience, I have somehow become responsible for the design of these applications, or at least have my hands involved in these projects. Our staff, the developers at least, have each worked in the healthcare IT space for about 10-15+ years.
I myself, though, am the odd one out with just under 3 years, so I spend most of my time learning from them, from other industries, and from other developers to get a grasp of how to develop and design applications.
I for one feel the biggest problem with clinical IT is, as you stated, the belief that there's a "magic bullet" that can solve everyone's problems, and that IT is that bullet. This magic-bullet idea seems to live on both sides: hospitals as well as vendors seem to think that IT will fix it all. The reality that both of us seem to see, though, is that it won't solve everything. It will solve some things, but more important, I think, is that it can only help clinicians.
IT cannot solve all of clinicians' problems because, as you stated, healthcare is a complex, dynamic beast to tackle. Rather than trying to solve every problem, I feel it would be best to use IT to make processes more efficient where applicable. Before I go too far, though, I need to define people as I see them:
- Clinical people - Folks who understand the healthcare process down to the nitty-gritty details, and who understand the complexity and dynamics of working in healthcare.
- Clinical IT people - Folks like yourself, who understand both the clinical process and the IT side of things.
- IT people - Folks like myself; we understand IT, development, and the limitations of computers. We can see the big picture but don't quite understand all the details and how they work in healthcare.
Internally at our company, much the same as with the C-level people in hospitals, the assumption is that IT will fix [insert problem here]. The reality, though, is that it won't. If the problem is not thoroughly examined and researched, IT won't solve it; or, even worse, the problem can't be solved by IT at all, similar to your ED whiteboard example.
We are currently developing an application where the client is driving the design, while management is trying to keep the application generic enough that we can implement it at other sites. The problem is, we have no expertise in that area; our developers, designers, and implementers know and understand interfacing, not development in these other areas.
Management, in their infinite wisdom, think we can do this as long as the hospital is giving input, though there are no clinicians involved in the design, just the IT managers of the hospital.
This has become the bane of my employment: management sees a revenue stream and sends it downstream to the guys in the trenches to make it happen. The reality, though, is that we can make SOME of it happen, but not the entire solution. This seems to fall on deaf ears.
Sales and management are those in a position of power to drive what the front-line soldiers do, without understanding the problem. For example, we, the developers and implementers, have been tasked with developing a light-weight MPI [master patient index - ed.], which has been a success of sorts, since this falls within some of our staff's expertise.
But the project was doomed to fail from day one, because the expectation was a 100% patient match without [additional - ed.] data entry. This was similar to "solving" CPOE; the last numbers I read said CPOE covers about 20% of the total results/orders that exist today.
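[Why a 100% match rate without extra data entry is unrealistic may not be obvious, so here is a minimal, hypothetical sketch, not this writer's actual algorithm: matching on demographics alone either misses true matches or merges distinct patients. The records and the naive rule below are invented for illustration. - ed.]

from dataclasses import dataclass

@dataclass
class Record:
    last: str
    first: str
    dob: str   # YYYYMMDD
    sex: str

def demographics_match(a, b):
    """Naive deterministic rule: exact match on name + DOB + sex."""
    return (a.last.upper(), a.first.upper(), a.dob, a.sex) == \
           (b.last.upper(), b.first.upper(), b.dob, b.sex)

# Same person registered twice with a typo: the rule misses the match,
# so a duplicate chart is created.
print(demographics_match(Record("SMITH", "JON", "19600101", "M"),
                         Record("SMITH", "JOHN", "19600101", "M")))  # False

# Two different people who happen to share demographics: the rule
# wrongly merges their records.
print(demographics_match(Record("GARCIA", "MARIA", "19850315", "F"),
                         Record("GARCIA", "MARIA", "19850315", "F")))  # True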
The expectations and smoke screen put up by sales, marketing, vendors, etc. seem to have grabbed everyone's imagination. I think what should be put out there is reality; unfortunately, reality doesn't sell as well, which is a shame.
I've seen and heard of projects ranging from the tens of thousands to the multi-millions fail because the expertise isn't there. Management sees a glorious one-size-fits-all application and bites the bullet, only to see their 25-million-dollar budget for the year go down the drain on one project; now dozens of other projects are in peril or on indefinite hold for lack of budget.
My cousin works in admissions at a small hospital that recently installed [major vendor's] application for patient registration. I was speaking with him about this and believe it's a multi-million-dollar failed project. [Major vendor] doesn't internally have the staff to develop such an application; they didn't bring in clinicians, they didn't ask about the work-flow, and they designed an application that makes no logical sense (to a clinician or hospital employee).
Options are scattered, and the application expects a patient to be registered before being admitted. The problem is what happens in ED situations, where a person comes in with no identification and is near death; it's not easy to ask a person about to die for their information.
That's the reality of hospitals: anything goes, and [vendors should] develop the application to work in that manner. I also see another arena where IT has failed healthcare. The notion, the idea, of EMRs is grand and could be a good one, I believe.
... This boom in clinical IT is great, but the intentions of companies jumping into healthcare overshadow the purpose of the applications. Companies are rapidly developing and deploying applications without ever asking a clinician if they work.
[Major new application] is a $25 million application; it has many bells and whistles but lacks a fundamental core: how do you get data into it? It seems to manage data well and can do many things, but at the end of the day, if you can't get data into it, it's an expensive application that collects dust.
The revenue bug seems to have bitten many capable IT companies; unfortunately, they seem to have forgotten the fundamental step of simply asking, "What exactly is it you need?" Vendors aren't entirely to blame, of course; hospitals seem to want solutions but refuse to, or can't, put resources into creating a solution.
It's a pretty vicious circle of failed IT projects and lack of expertise, one that I think will keep going until people's perspectives on clinical IT change. If a viable solution is going to be created, the resources have to be put into it; the question, I suppose, is who is going to do it?
On a current project, several staff members of the client are designing the application, and internally several of our people are designing it, but neither side understands the other. This project has gone on for about 2-3 months now; I started on it initially, got pulled off, and am now back on it again.
Almost 3 months later, they're still discussing the items from day 1. There has been no progress in 3 months because neither side wants to put the resources into designing the appropriate application. The limitations on the technical side clash with the needs of the client, and the capabilities of the technology don't quite meet the client's needs, but no middle ground has been reached: no answers, and the same questions over and over.
Maybe one day a group of folks from all sides will get together and sit down to nail things out. Unfortunately, in my few short years, I've seen egos and politics clash too often to think it'll happen any time soon.
... Implementation all too often lives on the bottom line, whether forced to or not. At the end of the day, their job security depends on how productive they are, not on how sound the core is; that's management's and development's job, right? Ours (I say "ours" because I am currently most active in implementations) is just to get it up and running, working, and be done with it, then move on to the next job. Development and support can take care of the aftermath.
... There's a common theme among all three roles, though: why worry about the fundamentals and the core when it's already "good enough"? I think this is a mentality that needs to take a back seat; in my humble opinion, anyway, the mentality should be, "Does this work? Can it be better?" Of course, this has to be aligned with ROI (return on investment).
It seems to me, anyway, that there has been so much focus on creating teams and departments and on specializing people in specific areas that there are too few people left to bridge the gaps. This is where I hope I can play a role some day. To me, there's a lack of bridges between departments, fields of expertise, and the client. Making sense of everything isn't an easy task by any means, but it is a necessary one if an application/solution is to succeed, and it seems to me it's the first role to be dismissed.
It should be understood that I am not "against" health IT, nor a Luddite. I completed a postdoctoral fellowship in Biomedical Informatics in 1994 out of love for the idea of improving healthcare through IT. It is clear, however, as these examples and others at my academic website "Common Examples of Healthcare IT Difficulty" illustrate, that significant further research is needed to determine how best to make this technology meet the needs of real-world clinicians, and how best to implement it under real-world conditions.
I do not advocate abandonment of health IT, only a return to the understanding that this technology remains largely experimental. That understanding was usurped in the past decade by an overaggressive and indeed opportunistic HIT industry, enamored of profit potential, and the irrational exuberance has now spread in a manner reminiscent of other recent speculative bubbles.
HIT should be treated as experimental, not as a drop-in panacea for healthcare's ills. In its present state it is perhaps as likely to exacerbate those ills as to cure them. This is probably not a technology that should be deployed en masse at present; we cannot afford as a society to learn how to do this by trial and error.
EHR today: a plug-and-play panacea that will save $80 billion per year? Or a $20 billion abyss?
I report, you decide.
Case one of this series is here. Case two is here.
-- SS
7 comments:
This is an all too common description of many IT projects taking place today, in any environment. Vendors oversell, companies turn a blind eye to reality, and those down in the trenches, who really know what can and cannot be done, are ignored. To get ahead, you muddle along, hoping the company will pay the upcharges to fix what was in the original contract.
My attorney wife was pushed out of participating in an IT buildout because she would "ruin the deal." Friends work on very high-end computer simulations, and only when management gets out of the way do they and their customer counterparts develop what is really needed.
All the while, the managers I see complain about the lack of vision on the part of their staffs and about hearing the word "No." Somehow it never occurs to them to ask whether what they want can, or should, be done.
I once heard "dumb" defined as doing the same thing over and over and expecting a different outcome. There are a lot of dumb people in corporate IT positions.
Steve Lucas
The problem is, clinical settings should not be the experimental labs for "dumb people in corporate IT positions."
Legal liability for non-medical IT leaders needs to be pursued if IT harms a patient.
That may be the only way to get these people, their defective artifacts, and their dysfunctional mismanagement out of the clinic.
Agreed. To me this is a sad thing to have learned, but it seems the only way, as of late, to get management to listen and respond accordingly is to let things fail, let things blow up, and let them see and hear about it first hand before they come back and reassess things. Even so, they're still in the driver's seat and call all the shots. Maybe the post above mine is right: they need to deal with the liability issues, and maybe then they will start to listen...
Anonymous said...
Agreed. To me this is a sad thing to have learned, but it seems the only way, as of late, to get management to listen and respond accordingly is to let things fail, let things blow up
The problem in this field is that such blowups don't just involve a missed payroll, but patient morbidity and mortality.
As I wrote, the clinic and hospital are no place for IT experiments and people learning about computers in medicine by trial and error.
These folks instead need to spend a good amount of time IN THE CLASSROOM before unleashing defective clinical tools (EHR, CDSS etc. are clinical tools that happen to reside on computers) on unwitting patients.
Sorry, maybe a correction: jeopardizing patients is definitely not my intent. When I say to let things fail or blow up, these failures are entirely locked (sandboxed) inside a test system. Using test data covering as many scenarios as possible, the basic issues arise first; as more test data comes through, more issues arise, and the concerns, from both the folks in the trenches and the clients, reach management's ears.
It sounds hypocritical to let systems fail, even in a sandbox, in hopes of getting management to perk up long enough to ensure patient care is not jeopardized, but it seems to have worked and has become an effective way to catch those deaf ears.
Thanks,
Sorry, maybe a correction: jeopardizing patients is definitely not my intent.
I understood that!
-- SS
I think this would be a great way to strengthen the health system in the said country; IT is very useful in most fields, especially in health care.