Saturday, February 28, 2009

Information Technology Makes Healthcare Easier? Is This Industry Trying to Harm Patients? Part 6 of a Series

(Note: Part 1 of this series is here, part 2 is here, part 3 is here, part 4 is here, part 5 is here, part 6 is here, part 7 is here, and part 8 is here. 2011 addendums: a post that can be considered part 9 is here, part 10 is here.)

This post is part 6 of a series on the stunningly poor human engineering of production healthcare IT from major vendors, in use today at major medical centers. These devices provide a decidedly mission-hostile user experience, yet they are being touted, with an almost religious fervor, as cybernetic miracles to cure healthcare's ills.

This morning, during the Sunday talk show "Roundtable," I saw an IBM ad touting the fact that they'd surpassed the petaflop mark (that is, built computers that can perform one thousand trillion floating-point calculations per second).

They touted how such computers will enable weather prediction, medical advances, solutions to social problems, and other cybernetic miracles. (Some of these miracles have been promised since the days of ENIAC in the late 1940s.)

Now, I am indeed amazed by such machines, and I realize their value when utilized by competent domain experts overseeing equally competent analysts and programmers, ideally along with HCI (human-computer interaction) experts who can help humans interact effectively with such high-performance computation.

One would think, though, that in a culture where large machines can perform one thousand trillion calculations per second, and where even consumer machines are impressively fast, we could do better in the human interface to healthcare IT:

As of 2008, the fastest PC processors (quad-core) perform over 37 GFLOPS (Intel QX9775) [that's 37 billion calculations per second - ed.]. GPUs are considerably more powerful; for example, in the GeForce 8 Series the nVidia 8800 Ultra performs around 576 GFLOPS on 128 processing elements ... There are now graphics cards such as the ATi Radeon HD 4870X2 which can run at over 2.4 TeraFLOPS [2.4 trillion calculations per second - ed.]

Amazing. I used to support applications running on IBM POWER-based supercomputers, far slower than 1,000 trillion FLOPS, for drug discovery at Merck, and chatted with those who used them for the molecular modeling critical to that work. The computing advances made in just the few years since are indeed remarkable.

Here is where my disappointment arises.

Health IT seems to be back in the TRS-80 days in its failure to utilize these levels of computing power to create a mission-friendly user experience and safer medical care.


HIT user experience: trapped in the TRS-80 era?


Below is how a major healthcare IT system forces a user to do even a simple task: changing the starting date and time for a common medication. Let's count the steps required for what used to require one step, putting pen to paper.



Keep in mind, health IT is touted as improving the quality of healthcare, reducing errors, and reducing costs.

Step one in the world of HIT (a world, as I said, of MIS-inspired inventory systems, not clinical tools for use by clinicians) involves hunting for, and then clicking, "frequency" on a scrolling list of possible "order details," seen at the left of the screen below:

[Screenshot: the order-entry screen, with the scrolling "order details" list at left]


The user then clicks on the "ellipsis button" at upper right, which produces a popup to display the "standard times" for B.I.D. (twice a day) meds:

[Screenshot: the popup showing "standard times" for B.I.D. medications]


On the popup subscreen is a list of "standard medication administration times," or SMATs. Here the BID times are shown as 0900 and 2100 (9 AM and 9 PM).
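For readers outside clinical computing, note how conceptually tiny a SMAT table is. A minimal sketch in Python of what such a lookup might contain (the frequency codes and times here are illustrative only, not any vendor's actual data):

# Hypothetical standard medication administration times (SMATs),
# keyed by frequency code. Values are illustrative only.
STANDARD_TIMES = {
    "QD":  ["0900"],                          # once daily
    "BID": ["0900", "2100"],                  # twice daily
    "TID": ["0900", "1300", "2100"],          # three times daily
    "QID": ["0900", "1300", "1700", "2100"],  # four times daily
}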

To override these times, more screens, more clicks and more work are needed. The user must click on a "requested start date and time," which has defaulted by programmer edict to the CURRENT date and time. The user is warned in the "User Guide" about how to perform this function as follows:


CAUTION: If you do not change the [default] requested start date and time, the first dose will be scheduled for the current date and time.

The "current time" here being 11:28 AM as on left:


[Screenshot: the "requested start date and time" fields, defaulted to the current date and time]



Fantastic. The user must then change the requested start date and time to match the SMAT (standard medication administration time) for the first dose. Say they want the first dose to be given at 2100 (9 PM) tonight.

In the screen below, the user changed this to 6/5/2006 at 2100 (9 PM) to match the SMAT time. The first dose is to be given at 2100 (9 PM) on the current night.


[Screenshot: the requested start date and time changed to 6/5/2006 at 2100]
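Contrast all that clicking with what the software itself could trivially compute: the next standard slot at or after "now." A minimal sketch of that logic, building on the hypothetical SMAT table above (the names and behavior are mine, purely for illustration):

from datetime import datetime, timedelta

STANDARD_TIMES = {"BID": ["0900", "2100"]}  # BID entry from the sketch above

def next_standard_start(frequency, now):
    # Return the next standard administration slot at or after `now`.
    for day_offset in (0, 1):  # try today, then tomorrow
        day = now + timedelta(days=day_offset)
        for hhmm in STANDARD_TIMES[frequency]:
            slot = day.replace(hour=int(hhmm[:2]), minute=int(hhmm[2:]),
                               second=0, microsecond=0)
            if slot >= now:
                return slot
    raise ValueError("no standard times defined for " + frequency)

# At 11:28 AM on 6/5/2006, a BID order would sensibly default to 2100 tonight:
print(next_standard_start("BID", datetime(2006, 6, 5, 11, 28)))
# 2006-06-05 21:00:00

In other words, the "right" default was computable from data the system already had.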


But they're not done yet!

After signing the order, the clinician making the change (often a nurse) will need to click the eMAR (Electronic Medication Administration Record) tab. They must verify something called the "eMAR time frame setting" to be certain that this "time frame" is set for something called the "clinical range." See below:

[Screenshot: the eMAR tab and its "time frame" setting]

This act comes with a warning in the instruction manual:


CAUTION: Never limit the eMAR to your shift or to today only. Limiting your view of the eMAR may cause you to inadvertently create medication errors. (!)

Inadvertently create medication errors? Really? Should be easy to fix with machines capable of trillions of calculations per second, no?

No. The user is advised that if the eMAR timeframe is not set to "Clinical Range", all they need do is:

Close the patient's chart, and log out of the health IT application. When you log on next time, the time frame will default to Clinical Range.

Simple, no? Something we all want our doctors and nurses to be doing day in, day out for just this one simple function, no?
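For contrast, making the safe setting the default every time a chart is opened, rather than only at the next logon, is close to a one-line fix in any reasonable architecture. A minimal sketch, using an entirely hypothetical session object (this is not any vendor's actual API):

CLINICAL_RANGE = "Clinical Range"  # the safe, full-view eMAR time frame

class ChartSession:
    def open_chart(self, patient_id):
        # Reset the view to the safe default on every chart open,
        # so no clinician must log out and back in to recover it.
        self.emar_time_frame = CLINICAL_RANGE
        self.load_chart(patient_id)

    def load_chart(self, patient_id):
        pass  # fetch orders, eMAR entries, and so on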

Who designed such an interface? What kind of user experience is this, exactly? One designed to make clinical work easier?

Perhaps the HIT designers might seek a brain transplant or some other procedure to restore brain function, and then utilize the amazing computational power of modern IT and the advances in biomedical informatics, computational linguistics, parsers (of the kind used in building compilers), and so forth, to allow a user to type in a command with an easily learned syntax such as:

> change Amoxicillin bid 0900 2100 start 06/05/2006

and have the computer pop up a verification window that asks:

Are you sure you want to change Amoxicillin to twice daily, 0900 and 2100 (9 AM and 9 PM), starting Monday June 5? (Yes/No)
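To show just how modest an ask that is, here is a minimal sketch of such a parser in Python, using the command grammar proposed above. The function and dictionary names are mine, purely for illustration; a real implementation would also validate the drug against the formulary and the times against the SMAT table.

from datetime import datetime

FREQUENCY_NAMES = {"qd": "once daily", "bid": "twice daily",
                   "tid": "three times daily", "qid": "four times daily"}

def parse_change_order(command):
    # Parse e.g. "change Amoxicillin bid 0900 2100 start 06/05/2006"
    # and return a human-readable confirmation prompt.
    tokens = command.split()
    if tokens[0].lower() != "change" or "start" not in tokens:
        raise ValueError("expected: change <drug> <freq> <times...> start <mm/dd/yyyy>")
    start_index = tokens.index("start")
    drug, freq = tokens[1], tokens[2].lower()
    times = tokens[3:start_index]
    start = datetime.strptime(tokens[start_index + 1], "%m/%d/%Y")
    clock = " and ".join(
        datetime.strptime(t, "%H%M").strftime("%I %p").lstrip("0") for t in times)
    return ("Are you sure you want to change %s to %s, %s (%s), starting %s %d? (Yes/No)"
            % (drug, FREQUENCY_NAMES[freq], " and ".join(times), clock,
               start.strftime("%A %B"), start.day))

print(parse_change_order("change Amoxicillin bid 0900 2100 start 06/05/2006"))
# Are you sure you want to change Amoxicillin to twice daily,
# 0900 and 2100 (9 AM and 9 PM), starting Monday June 5? (Yes/No)

Twenty-odd lines, no petaflops required.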

Or are such undergraduate-level computer science feats beyond what makes "a good business case" for the HIT vendors, who instead bank on the cognitive diligence of healthcare providers to make up for HIT stupidities?

Are these vendors aware of fifty+ years of computer science, information science, biomedical informatics, HCI and other research? Who, exactly, is creating, testing and approving the clinician user experiences I am illustrating?

Clinicians as technophobes, when we're talking about IT as poor as that presented in this series? My a**.

In part 7 and beyond we shall start to see the "peek-a-boo, wild goose chase, go find your lab data" screens I've mentioned.

-- SS

3 comments:

Anonymous said...

Here's a question I have: If the interface for an eMAR was designed by a physician like yourself, how much difference would there be in another physician's workflow compared to your design? Is it fair to assume what works well for you (since you designed it) will work well for others, or would there be a variation?

(note: I work in healthcare IT, but in the desktop/infrastructure side of the house)

Anonymous said...

Hi Nick, great question, because it is the wrong question, and it gets asked all too often. You don't ask physicians to design a screen; they aren't UI designers. Instead, UI designers study how physicians work, then build a UI that represents the best approach. They test that design with real physicians and validate that it is usable and functional.

Does Toyota build every option requested by all customers? No, they research this. Does Starbucks offer every custom drink possible? Nope, again it is researched. Do you give every doctor what they think they want in an EMR? Of course not.

InformaticsMD said...

Is it fair to assume what works well for you (since you designed it) will work well for others, or would there be a variation?

If done well, following principles known for decades, and refined with appropriate expert leadership before release, it's highly likely it will work well while minimizing clinician fritter and hair pulling. It's not rocket science.

On the other hand, misdesign by amateurs almost guarantees a poor user experience.

Illustration (OS zealotry aside):

How well does Mac OS X or Windows XP/Vista/7 work for you in doing your everyday IT chores?

-- SS