The Joint Commission, which accredits healthcare organizations in the U.S., has issued a Sentinel Event Alert (link) on the dangers of poorly designed or poorly implemented healthcare information technology (electronic medical records, computerized physician order entry, clinical decision support, etc.).
What I and a number of others have been writing about for the past decade - as a counterpoint to the prevailing "irrational exuberance" about healthcare information technology as a silver bullet for healthcare's ills, especially under traditional information systems leadership - has now appeared from an organization that might actually be listened to.
Listened to, that is, over the siren calls of the Big Business consortia, the IS departments in hospitals, the information technology industry, the former CEO of Intel, Bill Gates, and maybe even Oprah, all of whom ignore the downsides of healthcare IT and make it seem as easy as 1-2-3 ...
I also believe this Joint Commission alert supports the contention of a number of people in my field that leadership by formally trained and experienced Biomedical Informatics professionals should be, where possible and available, the sine qua non of healthcare informatics initiatives. (I should also include those without formal biomedical informatics training but with deep experience in the research and methods of the field, and a track record of applied success.)
Human-computer interaction, the unintended consequences of healthcare information technology, and other sociotechnical matters are areas we study professionally, combined with actual medical experience. These are not areas best left to ad hoc or on-the-job training, or to "amateurs" in business IT or medicine (amateurs in the sense that I am a radio amateur, or ham, with some technical experience but not a telecommunications professional).
Several of my comments on the Alert appear below in bracketed [red italic]. Emphases in boldface are mine:
From the Joint Commission, a new Sentinel Event Alert on HIT
Safely implementing health information and converging technologies (excerpts)
Dec. 11, 2008
As health information technology (HIT) and “converging technologies”—the interrelationship between medical devices and HIT—are increasingly adopted by health care organizations, users must be mindful of the safety risks and preventable adverse events that these implementations can create or perpetuate.
Technology-related adverse events can be associated with all components of a comprehensive technology system and may involve errors of either commission or omission. These unintended adverse events typically stem from human-machine interfaces or organization/system design [a.k.a. the cognitive academic "soft stuff" that Management Information Systems personnel often scoff at - ed.].
The overall safety and effectiveness of technology in health care ultimately depend on its human users, ideally working in close concert with properly designed and installed electronic systems. Any form of technology may adversely affect the quality and safety of care if it is designed or implemented improperly or is misinterpreted. Not only must the technology or device be designed to be safe, it must also be operated safely within a safe workflow process.
... There is a dearth of data on the incidence of adverse events directly caused by HIT overall. [This has been a chronic problem partly due to, I believe, corporate and marketing control of the HIT narrative, and ‘political correctness’ of the healthcare and academic communities, at patient expense - ed.] The United States Pharmacopeia MEDMARX database includes 176,409 medication error records for 2006, of which 1.25 percent resulted in harm. Of those medication error records, 43,372, or approximately 25 percent, involved some aspect of computer technology as at least one cause of the error. Most of the harmful technology-related errors involved mislabeled barcodes on medications (5 percent), information management systems (2 percent), and unclear or confusing computer screen displays (1.5 percent). The remaining harmful errors were related to dispensing devices, computer software, failure to scan barcodes, computer entry (other than CPOE), CPOE, and overrides of barcode warnings. (See the sidebar for a breakdown of these data.)
... Contributing factors
Inadequate technology planning [It would be extremely helpful if the Joint Commission explored how that happens, exactly - ed.] can result in poor product selection [not to mention vendor favoritism and CIO conflicts of interest - ed.], a solution that does not adapt well to the local clinical environment, or insufficient testing or training. Inadequacies include failing to include front-line clinicians in the planning process [unbelievably, this is not uncommon. What manner of IT and executive personnel can make such imprudent decisions, the Joint Commission should ask - ed.], to consider best practices [it is indeed puzzling that including clinicians in HIT is not a "best practice" obvious even to, say, our computer-literate children - ed.], to consider the costs and resources needed for ongoing maintenance, or to consult product safety reviews or alerts or the previous experience of others [and the previous and current experience of Biomedical Informatics professionals, I might add - ed.].
... An over-reliance on vendor advice [in direct terms, and in words the Joint Commission will not use, such advice is likely tainted by conflict of interest - ed.], without the oversight of an objective third party (whether internal or external), also can lead to problems. “There’s often an expectation that technology will reduce the need for resources, but that’s not always true,” says Bona Benjamin, BS Pharm, director of Medication-Use Quality Improvement, American Society of Health-System Pharmacists. Instead, technologies often shift staffing allocations, so there is not typically a decrease in staff.
Technology-related adverse events also happen when health care providers and leaders do not carefully consider the impact technology can have on care processes, workflow and safety. “You have to understand what the worker is going through [It is unclear to me why it is assumed that non-biomedical IT personnel can truly do that - ed.] – whether that worker is a nurse, a doctor, a pharmacist or whoever is using the technology. The science of the interplay between technology and humans or ‘human factors’ is important and often gets short shrift,” says Ronald A. Paulus, M.D., chief technology and innovation officer, Geisinger Health System.
If not carefully planned and integrated into workflow processes, new technology systems can create new work, complicate workflow, or slow the speed at which clinicians carry out clinical documentation and ordering processes. Learning to use new technologies takes time and attention, sometimes placing strain on demanding schedules. The resulting change to clinical practices and workflows can trigger uncertainty, resentment or other emotions [and behaviors such as active and passive aggression - ed.] that can affect the worker’s ability to carry out complex physical and cognitive tasks.
For example, through the use of clinical, role-based authorizations, CPOE systems also exert control over who may do what and when. While these constraints may lead to much needed role standardizations that reduce unnecessary clinical practice overlaps, they may also redistribute work in unexpected ways, causing confusion or frustration. Physicians may resent the need to enter orders into a computer. Nurses may insist that the physician enter orders into the CPOE system before an order will be carried out, or nurses may take over the task on behalf of the physician, increasing the potential for communication-related errors.
Physicians have reported a sense of loss of professional autonomy when CPOE systems prevent them from ordering the types of tests or medications they prefer, or force them to comply with clinical guidelines they may not embrace, or limit their narrative flexibility through structured rather than free-text clinical documentation [and business IT and technical personnel, often with insufficient human-skills levels, may be assigned to "fix" the problems. Talk about being in over your head - ed.]. Furthermore, clinicians may suffer “alert fatigue” from poorly implemented CPOE systems that generate excessive numbers of drug safety alerts. This may cause clinicians to ignore even important alerts and to override them, potentially impairing patient safety.
Read the whole thing.
Then compare it with what I've been writing at my educational site on HIT difficulties here, founded in 1998.
It's about time.
I can add that the most critical "best practice" regarding HIT for healthcare organizations generally, and for healthcare IS departments in particular, is knowing what you do not know, admitting that you do not know it, finding out who does know, and then making maximal use of those experts.
Addendum: I distributed the Joint Commission alert to a number of officials at my own organization, where I have apparently been deemed a "non-team player" for standing against what I see as cavalier attitudes regarding biomedical informatics expertise - attitudes that led to a federal lawsuit against the EHR vendor and many other difficulties. (See my post "Do Healthcare Organizations Truly Want Electronic Health Records To Succeed?")
Sadly, here is one response I received from a very senior official within the healthcare college. It read, simply:
"Please remove my name from your list serve."
As I wrote in my essay "Open Letter to President Barack Obama on Healthcare Information Technology", healthcare organizational leaders too often seem to think it's better that we "all get along" than that patients be protected:
Imagine for a moment my horror [in an ICU, as in the case here] of being unable to intervene due to administration's priorities of "everyone getting along" rather than the absolute protection of patients. Yet this is not an uncommon scenario in the IT backwater of hospitals.
That going through customary, bureaucratic, pathetically slow - and often ineffective - "channels" can trump getting patients protected from defective HIT systems is, in fact, a sign of a seriously debased and corporatized national healthcare culture.
Perhaps, thanks to this Joint Commission Alert, we won't have to wait for a cybernetic version of a "Libby Zion event" before healthcare organizations finally "get it" regarding Biomedical Informatics and the leadership of healthcare IT.
-- SS
Addendum Dec. 18:
Some have labeled me a "skeptic" of HIT. I'm not a skeptic at all. I've seen the technology work, in various settings. I've made it work in various settings, some rather unusual (e.g., in a Middle Eastern oil producing country where I once would not have been allowed to travel at all, let alone work on HIT for improving care of children).
I'm in fact a skeptic of the way health IT is currently pursued, especially its leadership model and its costs, both rooted in a management information systems paradigm of design, implementation and lifecycle. HIT is not MIS, and pursuing HIT as if it were MIS will cause continued difficulties, increased expense, and impaired diffusion.
I'm also a skeptic of the shroud of mystery, and in fact the de facto censorship, that surrounds its failures - failures largely caused by a "bull in a china shop" approach to HIT, an approach mediated by false assumptions about, and underestimation of, HIT's sociotechnical issues by inappropriate leadership. That my academic website on HIT difficulties remains nearly unique after ten years, and that there is so little information on HIT difficulties on the web, is in many ways remarkable.
Done right, HIT can succeed. See, for example, the text "Medical Informatics 20/20" for "best practices" that mean something.
The Joint Commission report should, in fact, be unnecessary. Much of what it states is obvious. That its findings need to be stated as an "alert" at all is perhaps reflective of the above problems.
"Gadfly towards ill informed HIT leadership" might be a more precise term to describe me.
-- SS