The Lessons of Chernobyl Are More Relevant Than Ever



Mistakes that don’t get mentioned don’t get resolved. That’s when crisis hits.

In recent months, we’ve seen a pandemic wreak havoc on the world and a reinvigorated national conversation on racism. These twin crises have renewed our appreciation for open and honest communication, and especially for those who speak up when it matters most.

Speaking up is critical to both identifying problems and finding solutions. Let’s turn to a historic example to explore why, and what happens when listening and communication break down.

What happened?

On April 25, 1986, the crew at the Chernobyl nuclear complex in Ukraine, then part of the Soviet Union, were preparing to shut down reactor No. 4 for routine maintenance. During the shutdown, they wanted to test how long the still-spinning turbines would continue to supply power to the main cooling pumps after the station lost electrical power.

The test began with the reactor in an unstable condition, and things went wrong quickly. A sudden power surge sent reactor pressure spiking, triggering two explosions that hurled smoke and radioactive material from the core nearly a mile into the sky. Fires broke out around the plant, and the exposed reactor core spewed radiation into the air.

The fires were quickly extinguished, but it took ten days to bring the reactor’s temperature down enough to reduce the release of radiation. During that time, a massive amount of radioactive material escaped. Within a few months, 26 people had died of radiation poisoning, including six firemen, and more than 300,000 people had been evacuated and resettled.

Why did it happen?

Chernobyl wasn’t an outlier. There had been warnings. An accident at the Leningrad plant in 1975 and an earlier fuel accident at Chernobyl itself had revealed weaknesses in the reactor design and its operating procedures. Yet the lessons of those accidents led to no major changes in how the plants were designed or run. The staff at Chernobyl weren’t even aware of the nature and causes of the Leningrad accident. The entire system lacked a safety culture, which meant individuals were not empowered to speak up when they saw something wrong.

Poor communication on the ground also played a role. The crew running the test and the crew responsible for the station’s safety never properly consulted and coordinated with each other beforehand. As a result, the test crew didn’t take adequate safety precautions, and the operating crew weren’t aware of the test’s safety implications. Once again, these mistakes were the product of a weak safety culture.

The test put the reactor into a dangerously unstable condition, and the automatic shutdown safety mechanisms had been deliberately disabled. It was also reported that Deputy Chief Engineer Anatoly Dyatlov, who was supervising the test, threatened to fire plant workers if they didn’t proceed with it. So even if members of the crew objected to what they were doing, there was little they could do about it. This was not a culture that encouraged independent thought or challenges to authority.

The Chernobyl disaster was the outcome of poor design, operator error, and the absence of a safety culture. All of these issues might have been avoided, or at least addressed, had people been encouraged to speak up without fear of the consequences. In this case, the Soviet system itself helped create the conditions that made the disaster possible.

How do you fix it?

Much of this may seem obvious in retrospect. How could they not have seen the flaws in their design? Why wouldn’t people speak up, no matter what, if what they were doing was so dangerous? Why wouldn’t safety always be the number one concern at a nuclear plant? Because group dynamics are tricky, and speaking up, no matter how crucial, can feel like staring into the mouth of a lion.

Luckily, there are ways to help people at your organization speak up, both to identify problems and to find the solutions we need.
