Case study: NASA

Overview

There are numerous examples online, and we’d encourage you to do a little searching. We’ve used some high profile studies to demonstrate Psychological Safety and Unsafety in stark relief. Once you’ve seen it in black and white, it becomes easier to see it in the more nuanced greys of a typical organisation.

In the next few pages you’ll find examples from NASA, Cycling New Zealand, and Boeing.

NASA

NASA provides two entries: Challenger and Columbia

CHALLENGER

In 1986, engineers working on the space shuttle Challenger raised concerns about defects in the O-rings, the rubber gaskets that seal the joints between segments of the solid rocket boosters. In truth, they had held these concerns for years, having seen flaws in O-ring performance on earlier flights.

The risks were consistently minimised, even as O-ring problems surfaced in all but two of the nine launches in 1985. Particular concerns were raised about how the O-rings would perform in cold temperatures: in the cold, the rubber stiffens and fails to seal effectively. Managers and engineers, especially Roger Boisjoly, believed the risks could be fatal and recommended delaying the launch. NASA pressed ahead, preparing to launch in winter.

Boisjoly and colleagues’ expert advice was ignored and they were personally disparaged.

On January 28, 1986, the O-ring seal in the right Solid Rocket Booster failed, allowing hot gases to escape and burn a hole in the external fuel tank. Seventy-three seconds after launch, around 2 million litres of fuel ignited. Six astronauts and one teacher died.

COLUMBIA

Seventeen years later, on January 16, 2003, NASA launched the space shuttle Columbia. Shortly after launch, during ascent, a large piece of insulating foam broke off the external tank and struck the left wing at up to 900 km/h, damaging its thermal protection tiles. Engineers reviewing the launch footage wanted more detailed images to assess the damage before the shuttle re-entered Earth’s atmosphere. During re-entry, temperatures are extreme, and a compromised thermal protection system could lead to catastrophic overheating.

NASA engineer Rodney Rocha tried in vain to obtain more information. However, previous foam strikes hadn’t caused major problems, there was a mission to complete, and the requests for imagery were cancelled. Rocha was ignored.

On February 1, 2003, superheated atmospheric gases broke through the weakened thermal protection system and destroyed the left wing’s structure. The orbiter disintegrated, and another seven astronauts died.

COLUMBIA ACCIDENT INVESTIGATION BOARD (CAIB)

Within about 90 minutes of losing Columbia, the CAIB was established. Its extensive review, compiled over seven months, was revealing, and reads as an almost textbook example of what a lack of Psychological Safety looks like.

Along with numerous technical analyses and recommendations, the CAIB devoted a good portion of its work to NASA’s history (including, specifically, the Challenger disaster) and organisational cultural factors. In fact, it concluded (p. 12) that ‘NASA’s organizational culture had as much to do with this accident as foam did’. (You can read the very full report here.)

The CAIB report highlighted several cultural factors that impacted the Columbia incident:

  • The prevailing attitude around past successes (they landed on the moon, after all), and a famous “Can Do” attitude, discouraged dissent and speaking up. Nobody wanted to say “We can’t…”. The strong cultural bias and overly optimistic organisational thinking undermined effective decision-making and took the place of sound engineering practice. They acted as if they were invincible.
  • Near-misses, including previous foam incidents, were not seen or analysed as potential failures. This discouraged people from asking challenging questions and raising concerns.
  • Tighter and more demanding expectations around deliverables, with fewer and fewer resources to do the work (do more with less!), led to a focus on hitting targets at the expense of safety, reliability, testing, and attending to warning signs. There were several missed opportunities.
  • Efficiency demands created conflicting, confusing goals, such as reducing the budget for safety while demanding high safety outcomes.
  • Organisational barriers prevented free exchange of information and shut down professional differences of opinion.
  • Disagreement was discouraged. Minority views were not considered.
  • Opaque decision-making had evolved from an informal chain of command and processes that operated outside NASA’s rules.
  • Silos existed across programme elements, with no management integration.

PSYCHOLOGICAL SAFETY TAKEAWAYS

As noted, it’s an almost textbook example of what a lack of Psychological Safety looks like. Alongside the many technical recommendations were cultural items such as:

  1. Craft a sense of vulnerability, not invincibility
  2. Ensure open and candid communications, up, down and across the organisation
  3. Develop the curiosity and culture of a learning organisation

This summary is an excellent description of how a Psychologically Safe organisation operates. Psychological Safety researcher Amy Edmondson discusses the Columbia disaster in this short video (external link) (1m41s).
