A five-minute read.
Just as ninety percent of managers rate themselves as “better than average” communicators, put a group of professionals together and a majority will say their problem-solving skills are among the best.
Often, though, this confidence isn’t born of an ability to face the unknown and navigate a successful path through a murky situation – it comes from the memory of making it out alive the last time.
Those who view themselves as great problem-solvers are frequently those who have experience with a specific issue. They’ve “seen this” – or something very much like it – before. They are faster than average at recognizing the concern and can quickly recall how they got out of it. (Not necessarily “solved” it – or it wouldn’t be back, would it?)
Their challenge comes when facing something new. Or something that presents as a “normal” problem, but has all the hallmarks of a real corker. The danger is in knowing enough about a problem to not be afraid to tackle it, but not knowing enough to recognize how little is really known.
A textbook case of this features in HBO’s Chernobyl. The premiere episode perfectly captures the particular hubris of the boldest of cause-jumpers: calm, in control, smugly self-assured, but absolutely and dreadfully wrong. This is tragedy in its classic sense: Deputy Chief Engineer Anatoly Dyatlov, though well-intentioned, brings ruin upon himself (and half the Soviet Union) because he refuses to see the truth.
Craig Mazin’s script highlights some classic Dangers of the Classic Cause Jumper.
Danger Number 1 – Believing Our Initial Instinct Is Correct, No Matter What Others Tell Us
If there can be a Most Classic of the “classic Dangers”, this is it.
What just happened?
I don’t know.
A third operator enters the control room.
Something blew up…
Dyatlov thinks through this.
The turbine hall… The control system tank… Hydrogen… You morons blew the tank.
Dyatlov didn’t get to be Deputy Chief Engineer without knowing a few things, and he knows that if there’s a fire in the hall, a tank exploded. That’s what has happened before, so that must be what’s happening now.
He instructs the operators to send cooling water into the core. But one of the operators tells him, “There is no core.”
He is dismissive: “You’re confused. Cores don’t explode.”
Dyatlov orders them to continue pumping water into the core. The one they (correctly) believe is gone. He says “we’re wasting time”, and they are – they’re doing that thing that won’t help, to the thing that isn’t there anymore.
Later, Dyatlov does some mental math to convince himself of his original diagnosis. Still, he asks whether his instructions were followed, whether the control rods were lowered into the core. An operator, at a complete loss, tells him, “It’s gone.”
Dyatlov rages, tells the operator he’s delusional. When it’s pointed out that this operator is covered in radiation burns, he sticks with his diagnosis. “Ruptured condenser lines will do that. He will be fine. I’ve seen worse.”
As bureaucrats fill the conference room, another engineer describes seeing a piece of glowing graphite on the ground. Dyatlov tells him:
You didn’t see graphite.
You didn’t. You didn’t. Because it’s not there.
Are you suggesting the core exploded?
You’re a nuclear engineer. Please tell me how a core explodes. I’d like to know.
Are you stupid?
Then why can’t you?
I don’t… I don’t see how it could explode.
The boss throws his arms up, as if to say, See! I was right all along!
But it did.
Once a Cause Jumper has latched on to an explanation, like a bulldog with a rawhide bone, he’ll never let go.
Corollary to Danger Number 1 – Bullying the Group
When the danger first presents itself, another operator, clearly afraid – whether for his safety or for his job security is unclear – agrees with Dyatlov: “What you’re saying,” he tells the operators, “is impossible.”
How often do good people pipe down in a meeting because the loudest voice claims authority? Or because their interpretation of the facts doesn’t have momentum?
Good problem-solving efforts will document the bully’s opinion, but also make sure to capture any contrary views.
Of course, putting effort into capturing the facts is the cure here. And facts can be tricky little buggers…
Danger Number 2 – Cherry-Picking Data
The best problem solvers “go to the Gemba” – they want firsthand, visual evidence of the issue. They want to see, hear, smell, touch, or taste what’s happening. In this case, the Gemba is a deadly place to be!
Even so, throughout the episode, operators return from the reactor, covered in radiation burns, bleeding, limping, literally melting – and yet no one uses that data. (In fact, several operators complain about a specific metallic taste in the air – surely evidence this is not a normal fire.)
In almost any situation, there’s objective, non-anecdotal data to be gathered – and at a nuclear power plant, the surest indicator of concern is radiation levels.
Dyatlov orders a measurement. He’s told that the reading is “3.6”.
“Not great, not terrible,” he says. (We learn later in the episode that 3.6 is, in fact, terrible enough that the entire town should have been evacuated.)
Even without that knowledge, he should be concerned – the maximum reading on the meter is 3.6! It’s pinned in the red – there’s no way to know how bad the situation really is.
More robust test equipment is kept under lock and key (presumably because rank and file can’t be trusted). Once it’s retrieved, and turned on, it reads so much danger so quickly that it shatters.
Dyatlov snorts, “Typical,” and his boss agrees: “They send us shit equipment.”
They try again, and a different one breaks: “Another faulty meter.”
How many times are circuit breakers reset – or pumps replaced – rather than chasing down the root cause? When every preventive measure in place trips, a good troubleshooter doesn’t presume every measure is flawed. She looks for the cause of the fault.
Danger Number 3 – Focusing on Blame
Dyatlov isn’t entirely certain of what’s happening, but he knows someone screwed up. (“You morons blew up the tank.”)
In fact, in the opening scene, we are told by our narrator, “All we want to know is, who is to blame.”
As plant managers and bureaucrats descend on the conference room, Dyatlov is grilled.
We ran the test exactly as the Chief Engineer suggested.
Dyatlov was supervising the test.
They reiterate the “low” (3.6) rad reading.
Then they actually applaud themselves on how well they are handling the crisis – while literally spewing nuclear waste into the air that their wives and children breathe.
Apollo 13 is often cited as a case study in classic troubleshooting triumphing over the fates of the gods themselves. But even then, after the explosion aboard the spacecraft, the first words out of Tom Hanks’ mouth are: “What’d you do?”
There will always be time for blamestorming. But good problem solvers wait until the fix is in place before they even begin to worry about who screwed what up.
You will likely (hopefully) never face a situation like the engineers of the Lenin Nuclear Power Plant in Chernobyl, Ukraine. But a Man of Action needs to be ready to move quickly and with limited data.
Sometimes that means jumping to the wrong course of action (deciding to cool a nonexistent core, for example). Once more information comes in – anecdotal or measured data – a Man of Action pivots. It’s a delicate balance – blindly adhering to an initial course isn’t “action” but activity.
A Man of Action will aggressively capture the facts, and challenge assumptions.