
Improving Safety By Taking More Risks: Lessons from High-Consequence Industries

2010/10/03

Photo courtesy of U.S. Army

Can safety be compromised by taking too few risks?  That’s the surprising finding of our work at Safety Across High-Consequence Industries (SAHI), a conference and collaborative network I have been a part of since 2004 as a researcher and conference steering committee member.  This interdisciplinary community is enriched by diverse viewpoints like those represented within the Center for Adaptive Solutions (CAS), which is why we’ve chosen Safety Across High-Consequence Industries as our theme for the CAS blog this month.

Safety Across High-Consequence Industries

SAHI was founded by Jeff Brown of Klein Associates and Manoj Patankar of St. Louis University in 2002 to transfer knowledge of safety improvement theory and practice across industries, for instance from aviation to healthcare.  In addition to a conference, there is a SAHI learning network, a Federal Aviation Administration-funded Center for Aviation Safety Research at St. Louis University, and a body of scholarship, including our comparative review of 13 safety cultures across the nuclear power, aviation, chemical, pharmaceutical, and construction industries[i] and our forthcoming safety culture[ii] book.

SAHI Defined

So, what do we mean by Safety Across High-Consequence Industries?

* “Safety” refers primarily to consumer safety (e.g., patients in healthcare; passengers in aviation) rather than workplace safety (as in the enforcement of OSHA regulations and the prevention of workplace accidents).  Other definitions of “safety” include psychological safety, an issue which emerges in research on the relationship between safety and risk that I will describe shortly.

* “High-consequence industries” refers to industries in which accidents can be catastrophic, causing loss of life (aviation, chemical, nuclear, healthcare), disruption to society (oil and rail), or risks and threats to consumer safety (healthcare, food production).  Other industries where accidents can have disastrous results include financial services and government.  So far, though, we have focused primarily on aviation and healthcare.

* “Across” means we emphasize systemic solutions to safety problems: solutions with high leverage across multiple areas of an enterprise, network, or social system, and across industries.  We also focus on how best to transfer knowledge, experience, and practice from one sector (or area, or discipline) to another.

* Finally, we focus on socio-technical solutions to safety problems: for example, improving consumer safety by improving both the “technical” system (such as software platforms and information exchange through technology) and the “social” system (through training, learning, and development programs for workers).

A Central Problem:  Transfer and Adoption of Best Practices

At first glance, the transfer and adoption of best practices would seem to be a relatively straightforward issue.  After all, who wouldn’t want to follow “best practices”?  It turns out that there are many impediments to the adoption, spread, and wider dissemination of practices which have been shown to improve results, even “evidence-based” best practices within one sector (or discipline), let alone across sectors.

In healthcare, “bench to bedside” transfer of best practices often takes about 14 years to reach mainstream adoption (e.g., Peter Angood, Joint Commission, SAHI Advisory Group meeting, September 2007).  In interviews my colleagues and I conducted with seven of Canada’s leading health researchers and practitioners in 2008, we found that widely acknowledged best practices (such as the Ottawa Ankle Rules) achieved only 30% penetration, and this was at the home institutions where the practices were developed and tested!

Unless we are willing to live with a 14-year lag time and 30% adoption rates in healthcare, we must understand what accounts for this and what we can do about it.

An Enterprise-Wide Safety Improvement Roadmap

Over the last ten years, my colleagues and I have developed a roadmap for enterprise-wide safety improvement and have tested it against case studies from high-consequence industries in aviation, healthcare, nuclear, chemical, and oil.  We have found that accident levels in these industries are correlated with safety culture, organizational effectiveness and efficiency, and overall performance.

We found a continuum of safety performance from Normal to Reliable to Highly Reliable to Ultrasafe (and, beyond that, Resilient), which also correlates with other predictable outcomes, such as financial results.  This scale coincides with the error rates associated with 2, 4, 6, and 7-9 Sigma performance levels, and with safety culture maturity stages that we call Secretive, Blaming, Reporting, and Just Culture.[iii]  The continuum is summarized below:

* Normal: 2 Sigma (error rate 31%); Secretive safety culture.  View of risk: avoid risk at all cost by reducing variability in practice and enforcing compliance with standard procedures and protocols.

* Reliable: 4 Sigma (error rate 0.62%); Blaming safety culture.  View of risk: understand and learn from risk by differentiating between productive and unproductive variations in practice (3 Sigma).

* Highly Reliable: 6 Sigma (error rate 0.00034%); Reporting safety culture.  View of risk: understand and learn from risk by making these distinctions explicit and reviewing them together (4 Sigma), and by testing and refining findings to date (5 Sigma).

* Ultrasafe: 7 Sigma (error rate 0.0000019%); Just safety culture.  View of risk: systematize and embed learning about risk by formally incorporating this learning into ongoing processes of incident review, and by continuously refining, testing, and learning (6 Sigma).

* Resilient: accept risk as normal and expected; anticipate risk and use it to innovate, while maintaining the overall “safety envelope” (no one is harmed by trials of new practices and procedures).
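The error rates above are simply the upper-tail probabilities of the standard normal distribution, computed with the conventional 1.5-sigma long-term shift used in Six Sigma practice.  The following minimal sketch reproduces the figures in the table; Python and scipy are our tooling choices for illustration, not anything named in the original work.

```python
# A minimal sketch (assumed tooling: Python + scipy) reproducing the
# error rates quoted for each Sigma level, using the conventional
# 1.5-sigma long-term shift from Six Sigma practice.
from scipy.stats import norm

def error_rate(sigma_level: float, shift: float = 1.5) -> float:
    """Fraction of defective outcomes at a given Sigma level:
    the upper tail of the standard normal beyond (sigma_level - shift)."""
    return norm.sf(sigma_level - shift)

for s in (2, 4, 6, 7):
    r = error_rate(s)
    print(f"{s} Sigma: {r:.7%} of outcomes defective "
          f"({r * 1e6:,.1f} errors per million)")

# Expected output (matching the table above):
#   2 Sigma: ~31%        (~308,537.5 per million)
#   4 Sigma: ~0.62%      (~6,209.7 per million)
#   6 Sigma: ~0.00034%   (~3.4 per million)
#   7 Sigma: ~0.0000019% (~0.019 per million)
```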

At the lower levels of performance (and safety culture maturity), the emphasis is on reducing variability in practice and enforcing compliance with standard procedures and protocols.  In this environment, “risk” is to be avoided at all cost.  At the upper levels of performance, though, risk is accepted as normal and expected; it is even invited in the service of anticipation and innovation, provided that the overall “safety envelope” is maintained (that is, no one is harmed by trials of new practices and procedures).  This is the world of resilience engineering as defined by  Erik Hollnagel and his colleagues, in which we examine what goes well and what is normal in an organization, as well as what goes wrong, in order to be better prepared to deal with surprising, risky situations when (not if) they come.

In the transition from a Reliable to a Highly Reliable enterprise (or system), it turns out to be necessary to take more risk, not less.  This is counterintuitive to many leaders, managers, and safety officers in high-consequence industries (such as healthcare) who are striving mightily to achieve and maintain Reliable performance levels.

In order to reach Reliable performance levels, safety leaders must rely on compliance, standardization, and rules which apply to everyone (with no exceptions).  But in order to achieve system-level safety performance at 6-9 Sigma, it is essential to develop two additional competencies throughout the organization:

  • The ability to take more risk in the service of discovering and adopting new and more effective practices, and of adapting to dynamic uncertainty; and
  • Situational awareness: the ability to distinguish when to take less risk and when to take more, without compromising the enterprise safety envelope.

The Key to Improving Safety:  Link it to Quality

The key to accomplishing this transition is linking the “safety” agenda to the “quality” agenda.  In brief, this means that at about the 3 Sigma level, enterprise managers should start telling the organization’s influencers to differentiate between productive and unproductive variations in practice.

At 4 Sigma, they should begin making these distinctions explicit and reviewing them together.  At 5 Sigma, they should test and refine their findings to date.  At 6 Sigma, they need to formally embed these distinctions in ongoing processes of incident review, refinement, testing, and learning.

Improving Safety Requires Taking On New Kinds of Risks

Navigating this kind of transition requires enterprise managers themselves to take new kinds of risks.  For instance, in the SAHI circle we have been looking at ways of transferring knowledge and experience of Crew Resource Management (CRM) from aviation to healthcare.  Early adopters have included national surgical leaders who have been active in the SAHI conference and learning network.

CRM is a form of team training and development which has been widely used with pilots and co-pilots and (to a lesser extent) air traffic control in the aviation sector.  To date, it has also been used, with promising early results, to improve communication in healthcare (see Nemeth, 2008), including at hospitals in Boston and Pennsylvania.[iv]

These demonstration projects have required the surgical and hospital leaders to take several risks: introducing the idea of cross-disciplinary “team” training in environments where this has not been done before; asking physicians to make changes in practice (such as schedule changes to accommodate the needs of the wider team); and persisting with these changes over 12-18 months, despite the lack of compelling quantitative evidence that the changes are working.

The Biggest Risk: Shifting Ingrained Mind Sets and Behaviors

Perhaps the biggest risk in both the Boston and Pennsylvania cases was starting the process itself, which required the surgical lead to pull the team together and make sense of problems and opportunities which might motivate the team to make improvements:

“I remember sitting in the room and, as we all talked, I realized that everyone had a valid gripe; everyone in the room had valid concerns.  The real issue was we didn’t have a way for the team to function with multiple personnel substitutions during the procedure…this was the root of conflict within the team; as participants verbalized their frustrations they recognized that the problems experienced by each role were interrelated.” (“Safety Culture in Aviation and Healthcare”, by Patankar, Brown, Sabin, and Bigda-Peyton, ch. 6, in press).

This is the human side of transferring “best practices” across high-consequence industries, such as aviation and healthcare.  To move ahead with such projects, leaders must take more risk, not less.

This runs counter to their deeply-learned habits and assumptions, such as “in order to be safe, do not take risks.”  Instead, they must adopt a different view: “in order to be safe, accept that risk is normal and to be expected.  Welcome it, understand the difference between productive and unproductive risk-taking, and conduct real-time experiments and rehearsals in which the chances of harm are low and the chances of team and organizational learning are high.”

Conclusion: Shift from Avoiding Risk to Anticipating, Responding to, and Learning from Risk

Shifting from a focus on Reliability to an emphasis on Resilience paves the way for a wider shift in the team, organization, or enterprise to higher levels of safety performance under greater degrees of stress, turbulence, and dynamic uncertainty.


[i] Patankar, M. S., Bigda-Peyton, T., Sabin, E., Brown, J., & Kelly, T. (2005). A Comparative Review of Safety Cultures. Federal Aviation Administration: ATO-P Human Factors Research and Engineering Division. (Grant No. 5-G-009)

[ii] Patankar, M. S., Brown, J. P., Sabin, E. J., & Bigda-Peyton, T. G. (In press). Safety Culture: Building and Sustaining a Cultural Change in Aviation and Healthcare. Aldershot, U.K.: Ashgate Publishing.

[iii] Patankar, M. S., Bigda-Peyton, T., Sabin, E., Brown, J., & Kelly, T. (2005). A Comparative Review of Safety Cultures. Federal Aviation Administration: ATO-P Human Factors Research and Engineering Division. (Grant No. 5-G-009)

[iv] Patankar, M. S., Brown, J. P., Sabin, E. J., & Bigda-Peyton, T. G. (In press). Safety Culture: Building and Sustaining a Cultural Change in Aviation and Healthcare. Aldershot, U.K.: Ashgate Publishing.

Nemeth, C. (Ed.) (2008). Improving Healthcare Team Communication: Building on Lessons from Aviation and Aerospace. Aldershot, U.K.: Ashgate Publishing.
