Doctor, I’m not comfortable with that order

Dec 5, 2012, 9:18 AM, Posted by Mike Painter


A little more than 13 years ago, the Institute of Medicine (IOM) released its seminal report on patient safety, To Err is Human. You can say that again. We humans sure do err. It seems to be in our very nature. We err individually and in groups—with or without technology. We also do some incredible things together. Like flying jets across continents and building vast networks of communication and learning—and like devising and delivering nothing-short-of-miraculous health care that can embrace the ill and fragile among us, cure them, and send them back to their loved ones. Those same amazing, complex accomplishments, though, are, at their core, human endeavors. As such, they are inherently vulnerable to our errors and mistakes. As we know, in high-stakes fields like aviation and health care, those mistakes can compound into catastrophically horrible results. The IOM report highlighted how known human error in health care adds up to mind-boggling numbers of injured and dead patients—obviously a monstrous result that nobody intends.

The IOM safety report didn’t just sound the alarm; it recommended a number of sensible things the nation should do to help manage human error: urging leaders to foster a national focus on patient safety, developing a public mandatory reporting system for medical errors, encouraging complementary voluntary reporting systems, raising performance expectations and standards, and, importantly, promoting a culture of safety in the health care workforce.

How are we doing with those sensible recommendations? Apparently, to delay is human too. Every year or so, of course, we trot out campaigns and outrage about the patient safety problem. We also have plenty of programs and initiatives attempting to address safety—institution by institution and through federal agencies. But we arguably have not credibly and systematically addressed the major recommendations in that report. We’re not even close. And every single day we still have major safety problems in almost every aspect of U.S. health care.

For example, we do not have anything like mandatory reporting of misses and near misses. We did get Patient Safety Organizations (PSOs) through the Patient Safety and Quality Improvement Act of 2005. PSOs are a loose network of designated entities scattered across the nation that gather information, on a voluntary basis, about some adverse events. They submit that data to a website operated by the Agency for Healthcare Research and Quality (AHRQ), called, somewhat ominously, the “PSO Privacy Protection Center.” It’s not clear what, if anything, happens with that information. It is clear, though, that the primary concern seems to be protecting the privacy of the information rather than using it urgently to address safety. More recently, AHRQ requested permission to run a pilot program that would facilitate consumer, as opposed to professional, reporting of medical errors. That experimental program is still under consideration.

In 2009, the Robert Wood Johnson Foundation, through its Pioneer Portfolio, extended a two-year planning grant to a group interested in creating a public-private response to the health care safety challenge, similar to the Commercial Aviation Safety Team (CAST). That group explored the possibility of creating a Public-Private Partnership to Promote Patient Safety (P5S). As CAST does in aviation, the P5S would work to identify and mitigate safety hazards. The group found numerous barriers to such a health care partnership, and the effort so far has yet to find its national footing. In health care, in spite of federal legislation and national attention, we seem to be having a hard time even creating the kind of error-reporting surveillance systems the aviation industry has had for years—much less establishing collaborations to handle the problems that get reported.

How about that “culture of safety”? Have we aggressively pursued every possible avenue to ensure that health professionals, patients, and families feel comfortable and empowered to look for, find, talk about, and resolve safety problems? Do most health professionals feel free to talk openly about mistakes and near misses with each other, as a team? These questions are obviously rhetorical. That’s unfortunate, because this culture issue may be the linchpin of successful management of error in medicine. We are collectively having a difficult time meeting these decade-old IOM recommendations, especially those requiring vast new data sources, reporting capability, and tricky collaborations. Maybe we should instead look hard at the root of the problem—the human factor—our inherent propensity to err and the ways the professional culture handles that basic fact.

In several recent posts on his terrific Not Running a Hospital blog, Paul Levy touches on these themes. He focuses on Crew Resource Management (CRM), an approach to error prevention used in aviation that should have applicability in health care. In those posts he cites an article that describes the use of CRM in the ICU.

Those authors note that,

“[i]n aviation, non-technical skills, a blame-free environment and Team Situational Awareness (SA) are considered CRM core competencies that require specific and focused training.” 

Those same authors also observed:

“The archetypical medical specialist’s personality (highly motivated, A-type, control freak) helps in creating an environment in which a junior team member could feel inhibited to offer input in a senior team with ‘vertical’ leadership. This impacts Team SA, posing a threat to process safety, and thus patient safety.”

Their point, like the IOM’s, is that the human propensity to err is at the very core of our safety problems. 

What if a large part of the answer to our safety challenge is not more and more layers of technical capability?  What if, instead, this challenge first and foremost requires the basics—like attention to team skills, composition, function, and training?  What if we worked hard to teach all health professionals and help all patients and families to be observant, assertive, and vocal about mistakes and potential mistakes?  What if we deliberately created enlightened clinical environments in which we embraced our human frailties, rather than worked so hard to deny them? 

Although errors will always be part of our nature, they do not necessarily control our destiny. Remember, Pope didn’t just say, “To err is human.” His full quote is important: “To err is human; to forgive, divine.” It’s not so much the errors; we all make them. Maybe it’s what we do together with those errors that ultimately matters most.