A five-minute read.
Just as most adults think they’re a “better than average” driver, and at least ninety percent of managers rate themselves as “better than average” communicators, if you assemble a group of professionals and ask them to rate their problem-solving skills, most will say they are among the best.
Confidence, particularly in trouble-shooting abilities, is great. Knowing that, no matter the challenge, you’ll be able to face it down and find the solution? That’s what being a Man of Action is all about.
Often, though, this confidence doesn’t come from an ability to face the unknown and successfully navigate through a murky situation – it comes from the memory of simply making it out alive the last time.
Those that rate themselves as great problem-solvers generally have experience with a specific issue. They’ve “seen this” – or something very much like it – before. They are faster than most at recognizing the concern and can quickly recall how they got out of it. (Not necessarily “solved” it – or it wouldn’t be back, would it?)
Their challenge comes when facing something new. Or something that presents as a “normal” problem, but has all the hallmarks of a real corker.
The danger is in knowing enough about a problem to be unafraid to tackle it, but not knowing enough to recognize how little is really known.
A textbook case of this features in HBO’s Chernobyl. The premiere episode perfectly captures the hubris particular to the boldest of cause-jumpers: calm, in control, smugly self-assured, but absolutely and dreadfully wrong. This is tragedy in its classic sense: Deputy Chief Engineer Anatoly Dyatlov (played to perfection by Paul Ritter), though well-intentioned, brings ruin upon himself – and half of the Soviet Union – because he refuses to see the truth.
Craig Mazin’s script highlights some classic Dangers of the Cause Jumper.
Danger Number 1 – Believing Our Initial Instinct Is Correct, No Matter What Others Tell Us
If there can be a Most Classic of the “classic Dangers”, this is it.
What just happened?
I don’t know.
A third operator enters the control room.
Something blew up…
Dyatlov thinks through this.
The turbine hall… The control system tank… Hydrogen… You morons blew the tank.
People don’t get promoted without knowing a few things, and a Deputy Chief Engineer is no different: he “knows” that if there’s a fire in the hall, a tank exploded. Every time he’s seen a tank explode, there’s been a fire like this one. So if there’s a fire like this one, his thinking goes, then the only reasonable explanation is an exploded tank. This is faulty reasoning, of course – the classic fallacy of affirming the consequent: just because A always causes B, it doesn’t follow that B is always caused by A.
He instructs the operators to send cooling water into the core. But one of the operators tells him, “There is no core.”
He is dismissive: “You’re confused. Cores don’t explode.”
Dyatlov orders them to continue pumping water into the core. The core they (correctly) believe is gone. He says “we’re wasting time”, and they are – they’re doing something that won’t help, to something that isn’t there anymore.
Later, Dyatlov does some mental math to convince himself of his original diagnosis. He asks if his instructions were followed, if the cooling rods were lowered into the core. An operator, at a complete loss, tells him, “The core’s gone.”
Dyatlov rages, tells the operator he’s delusional. When it’s pointed out that this operator is covered in radiation burns, Dyatlov sticks with his diagnosis. “Ruptured condenser lines will do that. He will be fine. I’ve seen worse.”
As bureaucrats fill the conference room, another engineer describes seeing a piece of glowing graphite on the ground. Dyatlov tells him:
You didn’t see graphite.
You didn’t. You didn’t. Because it’s not there.
Are you suggesting the core exploded?
You’re a nuclear engineer. Please tell me how a core explodes. I’d like to know.
Are you stupid?
Then why can’t you?
I don’t… I don’t see how it could explode.
The boss throws his arms up, to say, See! I was right all along!
But it did.
Once a Cause Jumper has latched on to an explanation, like a bulldog with a rawhide bone, he’ll never let go.
Corollary to Danger Number 1 – Bullying the Group
When the danger first presents itself, another operator, clearly afraid – whether for his safety or for his job security is unclear – agrees with Dyatlov: “What you’re saying,” he tells his colleagues, “is impossible”.
How often do good people pipe down in a meeting because the loudest voice claims authority? Or just won’t stop repeating themselves? The rational thinkers lose momentum, and the false narrative becomes the only narrative.
A Man of Action will document the bully’s opinion, but also make sure to capture any contrary views.
Of course, putting effort into capturing the facts is the cure here. And facts can be tricky little buggers…
Danger Number 2 – Cherry-Picking Data
To solve a problem, the Man of Action “goes to the Gemba” – wanting firsthand, visual evidence of the issue. He wants to see, hear, smell, touch, or taste what’s happening. In Chernobyl, though, the Gemba was a deadly place to be!
Even so, throughout the episode, operators go to and return from the reactor, covered in radiation burns; they are bleeding, limping, literally melting – and yet no one uses that data. (Several operators complain about a specific metallic taste in the air – surely evidence this is not a normal fire. That data is never captured.)
In almost any situation, there’s objective, non-anecdotal data to be gathered – and at a nuclear power plant, the surest indicator is radiation levels.
Dyatlov decides to take a measurement. He’s told that the reading is “3.6”.
“Not great, not terrible,” he says. (We learn later in the episode that 3.6 is, in fact, terrible enough that the entire town should have been evacuated.)
Even without that knowledge, he should be concerned – the maximum reading on the meter is 3.6! It’s pinned in the red – there’s no way to know how bad the situation really is.
More robust test equipment is kept under lock and key (presumably because the rank and file can’t be trusted). Once it’s retrieved and turned on, it reads so much danger so quickly that it shatters.
Dyatlov snorts, “Typical,” and his boss agrees: “They send us shit equipment.”
They try again, and a different one breaks: “Another faulty meter.”
How many times are circuit breakers reset – or pumps replaced – rather than chasing down the root cause? When every preventive measure in place trips, a good trouble-shooter doesn’t presume every measure is flawed. She looks for the cause of the cause.
Danger Number 3 – Focusing on Blame
Dyatlov isn’t entirely certain of what’s happening, but he knows someone screwed up. (“You morons blew the tank.”)
In the opening scene, we are told by our narrator, “All we want to know is, who to blame.”
Plant managers and bureaucrats descend on the conference room and grill Dyatlov.
We ran the test exactly as the Chief Engineer suggested.
Dyatlov was supervising the test.
They reiterate the “low” (3.6) rad reading.
Then they actually applaud themselves on how well they are handling the crisis – while literally spewing nuclear waste into the air that their wives and children breathe.
Apollo 13 is often cited as a case study in trouble-shooting skills triumphing over the fates of the gods themselves. But even then, after the explosion onboard the spaceship, the first words out of Tom Hanks’ mouth are: “What’d you do?”
There will always be time for blamestorming. But good problem solvers wait until the fix is in place before they even begin to worry about who screwed what up.
You will likely (hopefully) never face a situation like the engineers of the Lenin Nuclear Power Plant in Chernobyl, Ukraine. But a Man of Action needs to be ready to move quickly and with limited data.
Sometimes that means jumping to the wrong course of action (deciding to cool a nonexistent core, for example). But once more information comes in – anecdotal or measured data – a Man of Action pivots. It’s a delicate balance – blindly adhering to an initial course isn’t “action” but activity.
A Man of Action will aggressively capture the facts and challenge assumptions.