The authors conducted a cross-sectional study of urban hospitals in Texas using a "Clinical Information Technology Assessment Tool" (CITAT), a questionnaire designed to measure a hospital’s level of automation based on physicians' reported interactions with actual information systems.
They then examined whether greater automation of hospital information was associated with reduced rates of inpatient mortality, complications, costs, and length of stay for 167,000 patients older than 50 years admitted to responding hospitals between Dec. 1, 2005, and May 30, 2006.
Here is one of the study's findings as summarized in its abstract:
Results We received a sufficient number of responses from 41 of 72 hospitals (58%). For all medical conditions studied, a 10-point increase in the automation of notes and records was associated with a 15% decrease in the adjusted odds of fatal hospitalizations (0.85; 95% confidence interval, 0.74-0.97). Higher scores in order entry were associated with 9% and 55% decreases in the adjusted odds of death for myocardial infarction and coronary artery bypass graft procedures, respectively.
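To make the abstract's arithmetic concrete: an adjusted odds ratio below 1.0 translates to a percent decrease in the odds of death, which is where the "15% decrease" figure comes from. A minimal sketch of that conversion (my own illustration, not code from the study):

```python
# Converting a reported adjusted odds ratio into the percent change
# in odds quoted in the study's abstract.

def pct_change_in_odds(odds_ratio):
    """Percent change in odds implied by an odds ratio (negative = decrease)."""
    return round((odds_ratio - 1.0) * 100.0, 1)

# Notes-and-records automation: OR 0.85 per 10-point increase
print(pct_change_in_odds(0.85))  # the abstract's "15% decrease"

# The 95% CI (0.74-0.97) lies entirely below 1.0, which is why the
# decrease is reported as statistically significant.
ci_low, ci_high = 0.74, 0.97
print(ci_low < 1.0 and ci_high < 1.0)
```

By the same arithmetic, the reported 9% and 55% decreases for MI and CABG imply adjusted odds ratios of roughly 0.91 and 0.45, though the abstract quotes only the percentages.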
Having designed highly customized, detailed information systems for outcomes improvement and mortality and morbidity reduction in invasive cardiology, I am fascinated by the suggestion of significant mortality risk reductions in myocardial infarction (MI) and coronary artery bypass grafting (CABG) related to use of non-specialized computerized physician order entry (CPOE) technology.
The authors acknowledge that there are many possible confounding variables in this study, which is based on surveys of physician health IT usage and hospital reporting data rather than on far more robust randomized controlled trials. While I agree with the authors that follow-up validation of this cross-sectional study's findings is needed, I do have a concern.
I am troubled by the implication of such a cardiology mortality reduction based on CPOE use, if real.
If this finding is real, one implication is that the increased MI and CABG mortality in organizations *not* using CPOE is due to preventable errors of omission and commission in ordering. Importantly, such errors do not necessarily require expensive computers to correct. They can be corrected through human means.
While this implication makes the reduced cardiology mortality association sound possibly spurious, to my mind it is an alarming finding, potentially meriting prompt and comprehensive investigation.
After a possible "VIOXX moment" is discovered, just how long do we as a society wait before conducting a more thorough investigation?
Finally, the following question also arises. Do observational studies of HIT, subject to confounding and to false conclusions of causality from mere associations, possibly create more problems than they solve - for example, the "red flag" described above? Are such studies - as opposed to robust controlled clinical trials - akin to unnecessary medical testing that finds anomalies and "unidentified bright objects", resulting in more frittering away of time and money?
I do not know the answer to this question, but I do tend much more towards robust HIT evaluation studies. One reason is that significant money is about to be poured into HIT.
I feel it's best we actually know what we're doing now that $20 billion has been queued up to be handed out for HIT. Some of it will go to good people, but a significant amount will also go to pre-Flexner-style electronic snake oil salespeople in vendor organizations and hospitals, who will squander the funds on preventable IT misadventure. Let the Joint Commission Sentinel Event Alert on HIT and the National Research Council report "Current Approaches to U.S. Health Care Information Technology are Insufficient" be my witness.
(I'm not confident that critical-thinking people such as myself, who have not succumbed to irrational exuberance over HIT, will see any of that $20 billion, because we actually know what we're doing and don't suffer health IT malpractice and mal-practitioners easily.)