A 21st century plague might be called the "Syndrome of Inappropriate Over-Confidence in Computing" (SICC syndrome for short).
It's bad enough that we are placing major decisions about industrial policy at the tender mercies of computer models. For example, basing conclusions about whether global warming is real or not on computer models built by scientists who cannot reliably predict next week's weather, or exactly where an active hurricane under comprehensive observation from sky and satellite will strike, seems presumptuous at best. It reflects a syndrome of inappropriate overconfidence in computing, a belief in "cybernetic magic" if you will.
The same overconfidence, indeed to the point of irrational exuberance, affects other domains. One is healthcare IT. As I shall point out, the SICC syndrome has helped cause major problems elsewhere as well, yet the appetite for still more computer magic appears to be spreading.
As an aside regarding global warming due to man-made causes, as a ham radio enthusiast I am a bit more concerned about what might be happening on good ol' Sol, that class G2 V main sequence dwarf star (the one we see in our sky every cloudless day). Said G2 star is having a problem generating sunspots at the beginning of its current 11-year cycle, a regular cycle now observed for centuries:
Sunspot (Wikipedia): A minimum in the eleven-year sunspot cycle happened during 2008. While the reverse polarity sunspot observed on 4 January 2008 may represent the start of Cycle 24, no additional sunspots have yet been seen in this cycle. The definition of a new sunspot cycle is when the average number of sunspots of the new cycle's magnetic polarity outnumbers that of the old cycle's polarity. Forecasts in 2006 predicted Cycle 24 to start between late 2007 and early 2008, but new estimates suggest a delay until 2009.

Emissions associated with sunspots profoundly affect Earth's ionosphere and thus shortwave radio propagation. Those wavelengths refract, or "bounce," in the ionosphere and can thus travel beyond the earth's curvature.
K7RA solar update: Last week's sunspot group was only visible for three days, December 10-12. The average daily sunspot number for all of 2007 was 12.8; if we see no sunspots for the rest of 2008, the average for this year will be 4.7. By comparison, the yearly averages of daily sunspot numbers during the last solar minimum (1995-1997) were 28.7, 13.2 and 30.7. This solar minimum is much lower than the one about 12 years ago.
The little mystery of the sun's currently clear complexion is, in fact, unexplained by any science we know. Heaven knows what other effects are occurring as a result of solar mysteries such as this. The page on Sol linked above contains interesting theories on solar variations, but robust, reliable, predictive computer models for either man-made or solar-related climate change? Ha.
Regarding the SICC syndrome and healthcare IT, physicians and other clinicians must at some point realize their profession is being encroached upon by one of the most arrogant occupations known to mankind, the business IT specialty (a.k.a. management information systems, or MIS). A belief that mastery of IT in business, a field whose own track record of failure, waste and excess is far from stellar, gives one the expertise and authority to declare oneself an expert in issues deeply affecting healthcare is about as arrogant as it gets.
Worse, the arrogance is coupled with ignorance about decades of research in social informatics (the study of the social impacts of computing), observational studies, biomedical informatics, computer science, etc. (A worst-case scenario has occurred to me: that people who gravitate toward business IT may lack the interpersonal skills and insight into human behavior necessary to understand the aforementioned domains and their real-world importance.) In any case, this encroachment on medicine by the business IT industry is an unwarranted, unparalleled intrusion in pursuit of power, territory and profit, a form of cross-occupational piracy.
Further, the gargantuan leap of faith -- with an almost religious fervor -- from health IT as a facilitating tool for clinicians to a tool that will "revolutionize healthcare" in the face of massive, recurrent, serious practical problems is another example of SICC. The recent Joint Commission sentinel alert on Healthcare IT (PDF) is the first formal, widespread acknowledgment of this issue by a healthcare regulatory agency with real clout anywhere in the world, as far as I know.
Those who have written about the risks of HIT when it is improperly designed and implemented have taken reputational hits as alarmists. I've been writing on these same points for at least the past decade, in fact, as have others who share my concerns. I wonder how many of those who critiqued the "alarmists" would, after the JC Sentinel Alert, now admit that their brains were running on three cylinders, 80 octane and wishful thinking, while ours ran on eight supercharged cylinders, 96 unleaded and reality-based observation.
It is perhaps symptomatic of SICC that the recent Boeing 737 accident in Denver, even without loss of life, will be investigated far more thoroughly than all HIT failures combined.
I, for one, would welcome a cessation of claims that IT will "revolutionize" any field that depends primarily on cognition, such as biomedicine, and a return to more temperate attitudes instead of the almost bellicose grandiosity about HIT we see today. That is to say, an attitude that HIT - with proper contributions from the aforementioned specialties - will facilitate better health care, not "revolutionize" it.
I also wonder how many of those who critiqued these HIT concerns had a lot of money invested in the stock market's "sure bets" in recent years.
In a Wall Street Journal article about the recent history-making $50 billion-plus Madoff financial fraud, "Former Mayor, Millions Lost, Tells How He Was Lulled" (Dec. 20, 2008, subscription required), I note the following:
[Former mayor of Fort Lee, N.J., Burt Ross, who once worked as a Wall St. stockbroker himself] says he remembers being puzzled about how Mr. Madoff was able to show positive returns, even in months when the stocks Mr. Madoff's fund owned were down.
He pushed such thoughts aside. "I thought, 'Who am I to question?'" Mr. Ross says. "This guy has a formula involving computerized trading....It's like Coke. We're not supposed to know the formula."
Mystery formula for computer trading, was it! SICC syndrome incarnate. I wonder just how many people lost their life's savings on similar delusions.
The syndrome has now spread to another critical agency, the FDA. At "Computer debacle: a Broken down process at the agency - or beyond?" I had written that:
I believe [the FDA's failures in building IT systems to track drug adverse events] represent more than a "broken down process at the agency." It's a "broken-down process" in the world of IT, i.e., the belief that IT is a homogeneous industry where expertise in business computing equips one to do all computing. I would be curious to know the backgrounds of those IT personnel who were involved in the leadership, planning and development of AERS II. I would bet most had a technical focus, and I would also bet none had expertise in medical informatics.

Perhaps one day the drug industry, including the FDA, will accept the IOM's recommendations on medical informatics.
Those recommendations involved acquiring and empowering specialized people, not a resort to purported cybernetic miracles.
What does the FDA choose to do instead?
Resort to cybernetic miracles.
In "New Drugs, Virtual Tests" (Wall Street Journal, Dec. 17, 2008) we learn that:
The U.S. Food and Drug Administration plans to use new computer technology to simulate how some drugs in development are supposed to work, helping researchers and regulators spot safety and effectiveness issues before late-stage tests on humans are completed.
Entelos Inc., a Foster City, Calif., company that has developed the technology, said it will enable researchers to obtain computer-generated test results in a matter of days or weeks, compared with years required for most major clinical trials. Far more "simulated patients" also can be tested than in conventional human trials.
Under an agreement with the FDA that Entelos announced Tuesday, three drugs now being studied for heart-related conditions in large human trials will be tested by the simulation technology. Neither Entelos nor the FDA would disclose which drugs will be involved or which companies are developing them. The value of the contract also isn't being disclosed.
... "What this study is about is trying to anticipate bad scenarios before they occur," said Robert Powell, associate director in the office of translational sciences in the FDA's Center for Drug Evaluation and Research.
So, instead of relying on building better capacity to sample data from the real world, highly speculative and (as far as I know) untested-by-clinical-trials "cybernetic miracle" simulations will be used to evaluate early drug candidates. (Disclaimer: I have no knowledge of or connections to this or any other company involved in such work. I discovered this issue as a result of reading the WSJ.)
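To make the limitation concrete, consider a deliberately trivial sketch of a "virtual trial" of my own devising, with invented parameter names and numbers and bearing no relation to Entelos's actual technology: each simulated patient is simply a draw from a response model whose effect size the modeler chose up front.

    import random

    # Every "patient" below is a draw from a response model whose
    # parameters I made up; names and numbers are purely illustrative.
    ASSUMED_TREATMENT_EFFECT = 5.0   # systolic BP drop (mmHg) the model assumes
    PATIENT_VARIABILITY = 12.0       # assumed standard deviation of response

    def simulated_patient(on_drug):
        """Simulated change in systolic blood pressure for one virtual patient."""
        effect = ASSUMED_TREATMENT_EFFECT if on_drug else 0.0
        return random.gauss(-effect, PATIENT_VARIABILITY)

    def run_virtual_trial(n_per_arm=10000):
        """Difference in mean response between the drug arm and the placebo arm."""
        drug = sum(simulated_patient(True) for _ in range(n_per_arm)) / n_per_arm
        placebo = sum(simulated_patient(False) for _ in range(n_per_arm)) / n_per_arm
        return drug - placebo

    # The "finding" converges on whatever effect was programmed in above.
    print("Virtual trial effect estimate: %.2f mmHg" % run_virtual_trial())

The estimate converges on exactly the effect that was programmed in; the simulation cannot surface a harm or benefit that is absent from its underlying model, which is precisely why it is no substitute for sampling the real world.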
It is bad enough that insufficiently powered studies, and even seemingly robust studies in domains with small effect sizes, financial interests and prejudices, and other confounding factors, may be misleading (see "Why Most Published Research Findings Are False," John P. A. Ioannidis, PLoS Medicine 2(8): e124, August 2005).
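Ioannidis's point is, at bottom, simple arithmetic. Assuming the positive-predictive-value relation from that paper (and setting aside his bias terms), the post-study probability that a "positive" finding is true can be sketched as follows; the numbers are purely illustrative:

    def ppv(prior_odds, power, alpha):
        """Post-study probability that a 'positive' finding is true,
        per Ioannidis (2005), ignoring bias: (1 - beta)R / ((1 - beta)R + alpha)."""
        return (power * prior_odds) / (power * prior_odds + alpha)

    # A well-powered study of a plausible hypothesis:
    print(round(ppv(prior_odds=0.5, power=0.8, alpha=0.05), 2))   # about 0.89
    # An underpowered study of a long-shot hypothesis:
    print(round(ppv(prior_odds=0.1, power=0.2, alpha=0.05), 2))   # about 0.29

Even a well-powered study of a plausible hypothesis leaves a meaningful chance that a positive finding is false; an underpowered study of a long shot is more likely wrong than right.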
Now computer models and in silico simulations that purport to represent actual, immensely complex and poorly understood biologic and environmental factors well enough to enable real-world predictions will be used to influence clinical decisions - the same decisions that might best be made empirically in real biological systems, e.g., via in vitro and in vivo methods. That is a fantastic leap of faith. To the moon in a hot air balloon?
I note the Dec. 17 WSJ article does state that:
Mr. Powell [Robert Powell, associate director in the office of translational sciences in the FDA's Center for Drug Evaluation and Research] said regulators "wouldn't make a decision to kill a drug based on a simulation," but the findings could be used in discussions with drug companies to influence decisions such as the design of clinical trials. Eventually such information also could affect prescribing information included in drug labels.
Knowing the narrow-minded, profit-motivated, often conflicted boneheads who have invested our futures in the likes of Mr. Madoff's Ponzi scheme, Fannie Mae and the like, hedge funds, and the profoundly arcane bundles of financial toilet paper known as "securities" based on the former, I have very little confidence that regulators (and worse, non-scientist managers overseeing biomedical research budgets and portfolios such as here) will refrain from using these "cybernetic miracles" in making decisions -- both pro and con -- on new drug entities and other matters.
I propose a new term, the "cyber-industrial complex," to describe these spreading SICC scenarios.
Snake oil salesmen of the 19th century had little on today's Cyber-Übermenschen.
-- SS
Your examples point to modeling and simulation as part of the Oversell, and I think this is valid to a very large extent. Simulations might provide guidance, and occasionally insight, but only in very rare cases are they ever considered the final word on the subject. Certainly not by the people involved in performing them -- they know the implicit limitations all too clearly. Unfortunately, and here's the rub, there are snake-oil salesmen in all areas, including research. And with federal research budgets the way they are (or perhaps there's an unsustainable number of researchers), there's definitely a tendency to promise miracles.
Yet at the same time, I think it's pretty clear there _are_ fields depending "primarily on cognition" that depend fundamentally on computing technologies in some form (e.g., MRI, image processing, information retrieval); after all, simulation is a very small subset of the whole IT universe. Are you suggesting that, for example, these specific technologies did not revolutionize the practice of some aspects of health care?
I think not, but it's not clear and it seems like a rather broad brush is being used with respect to "IT." Not many people I know of would view computer modeling as any form of information technology; it's simply an activity that makes use of computers.
John wrote:
Yet at the same time, I think it's pretty clear there _are_ fields depending "primarily on cognition" that depend fundamentally on computing technologies in some form (e.g., MRI, image processing, information retrieval); after all, simulation is a very small subset of the whole IT universe. Are you suggesting that, for example, these specific technologies did not revolutionize the practice of some aspects of health care?
I am not critiquing overconfidence in IT in mechanistic processes such as image production. I have great confidence that a properly performed CT scan is a reliable representation of inner structures of the body, subject to the constraints of radiation and imaging physics and other issues.
I am critiquing an overconfidence in the perceived capabilities of IT regarding relatively abstract, cognitive matters.
The commonality between the computing done in electronic health records, clinical decision support, CPOE, etc. ("informational computing") and the computing done in CT, MRI, etc. ("imaging computing") is the fact that both rely on computers.
While the latter technologies, relying on computers as very fast and sophisticated calculators, did enable visualization of structures and did replace or modify certain types of interventions (e.g., the arthrogram), I don't think the term "revolutionize" is appropriate. "Incrementally improved some aspects of healthcare," yes. "Revolutionized them," no.
Invasive procedures are still the gold standard for many problems (e.g., solitary lung nodule), surgeons still operate, internists still prescribe meds, radiologists still read the CT's and MRI's subject to human error, radiation oncologists still bombard tumors with particles, internal medicine oncologists still administer chemo that causes complications, patients die, etc.
The use of computers as calculators to produce CT, MRI, and other sectional or 3D images certainly makes such images vastly more detailed, and makes them possible for more areas of the body.
Tomography existed, though, long before computers and CT scanners, dating to the early years of radiography. When I was a radiology resident in the early 1980's I performed such studies, for example, for better visualization of renal problems. They arose out of ingenious orchestrations of moving a patient and an x-ray source in tandem to maintain best focus within a plane of the body. This, of course, required a very detailed understanding of radiation and imaging physics to achieve, and its designers did so with "wetware" alone.
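The computed version, by contrast, reduces to a large volume of arithmetic, which is exactly the "fast calculator" role I am describing. Here is a deliberately simplified sketch of the idea, using unfiltered back-projection of a toy phantom and assuming numpy and scipy are available; real scanners apply a filtered algorithm and model the physics far more carefully:

    import numpy as np
    from scipy.ndimage import rotate

    # Toy phantom: a bright square in an empty field.
    phantom = np.zeros((64, 64))
    phantom[24:40, 24:40] = 1.0

    angles = np.arange(0, 180, 2)  # projection angles in degrees

    # Forward step: for each angle, sum attenuation along parallel rays
    # (a crude Radon transform).
    sinogram = [rotate(phantom, a, reshape=False, order=1).sum(axis=0) for a in angles]

    # Reconstruction step: smear each projection back across the image at
    # its angle and accumulate (unfiltered back-projection).
    recon = np.zeros_like(phantom)
    for a, proj in zip(angles, sinogram):
        smear = np.tile(proj, (phantom.shape[0], 1))
        recon += rotate(smear, -a, reshape=False, order=1)
    recon /= len(angles)

    print("Reconstruction peaks near the phantom:", recon[32, 32] > recon[5, 5])

Impressive arithmetic, but arithmetic nonetheless; nothing in it reasons about the patient.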
Whether something is "revolutionary" is perhaps a semantic issue; however, when we have a computer application that can reliably READ the CT's, MRI's and ultrasounds, and better yet, then provide reliable information on treating the patient with no further human domain expertise required, and/or then perform that treatment in an automated fashion, then I'll agree that something "revolutionary" has occurred.
Not many people I know of would view computer modeling as any form of information technology
The term "information technology" of necessity embodies computer hardware and software, since one would be useless without the other. Computer modeling, a process, is done with information technology.
The distinction between overconfidence in IT itself and overconfidence in the use of IT to perform some function is, in my mind, a whimsical one with reference to the theme of my essay - specifically, overconfidence in computing, as per its title.
-- SS