Note: This is part one. Part two is here.
Apropos of today’s escalation of the Fukushima reactor disaster to “7” (Chernobyl) on the international scale, there is something about the word “radiation” that bypasses rationality and whispers right to the lizard brain. Perhaps one or two other words have the same power to spook. “Pandemic” comes to mind. But that aside, there is little else that has quite the same potency as “radiation.” Almost all the connotations are negative: contamination; the atom bomb; fallout; nuclear waste disposal; dirty bombs; Godzilla; the ugly, yellow trefoil symbol. Even radiation’s positive connotations are negative. Nuclear medicine might save you…if you have cancer! And the New York Times has spent the past year running a series of stories about nuclear medicine gone horribly wrong. Radiation? Duck and cover.*
All this would seem to make any public communication about radiation hazards tricky. I can’t, however, speak from experience. Perhaps fortunately, Imagethief has never been called upon to participate in crisis communications involving radiation. Or nerve gas leaks or alien invasion or anything with quite the same pilo-erective zing. I’ve done food problems a couple of times. Also, years ago I helped an insurance industry association roll out a chain of inspection centers for mandatory post-collision inspections, which may actually be the closest I have ever come to being between a client and a torch-bearing mob. In terms of public sentiment, insurance and radiation aren’t actually all that different.
The challenge in a public safety communication situation is to find that delicate line between keeping people well informed and prepared for the worst, and going overboard and triggering a destructive panic. This isn’t a trivial thing. To get an idea of how easy it is for hysteria to blossom in the right circumstances, recall that at the height of the Japan nuclear issue there was a run on table salt (rumored to prevent radiation sickness) in Beijing, 2000 kilometers upwind of the Fukushima reactors.
What’s the worst that could happen?
The natural tendency in crisis communications is to undershoot and be tight-lipped. This happens for all kinds of reasons, of which avoidance of panic is perhaps the only noble one. The others include optimism or wishful thinking that the worst is over, poor communication planning, bad communication culture, and the general tendency to choose ass-coverage over ‘fessing up for any number of legal, reputational and personal reasons.
The problem with undershooting communication in an escalating crisis is that you wind up making a series of hopeful the-worst-is-over pronouncements, each of which is rapidly obliterated by events. TEPCO is a case study of this syndrome. So is BP from last year. Both companies wound up in a kind of reverse-Chicken-Little situation, where public trust evaporated and no one believed them anymore. In both cases, the government wrested much of the public communication responsibility away. Getting publicly ejected from the driver’s seat is not good for credibility.
If you were forced to choose between timidity and risk of panic, you might choose timidity as the lesser of two evils. But in public safety situations the rumors and misinformation that can drive hysteria are likely to thrive in the vacuum created by poor communication or lack of trust. The Beijing salt-run is illustrative. Ask yourself: strictly hypothetically speaking, how conducive would it be to public order to have an escalating radiation crisis being managed by an organization that no one believes?
This is the situation that TEPCO appears to have found itself in. A good analysis of their situation from a communication point of view can be found in the PR trade publication The Holmes Report. Paul Holmes introduces consultant Peter Sandman’s useful “hazard vs. outrage” model of risk communication. Hazard is the actual danger to people. Outrage is the emotional reaction provoked. Varying ratios of hazard and outrage inform communication strategies in risk situations. The irony for TEPCO is that the actual hazard has, to this point, been pretty low. But the outrage factor is, if you’ll pardon the pun, nuclear. Holmes lists some of the key factors driving that outrage:
- Coerced risk causes more outrage than voluntary risk, and the majority of people living close to nuclear plants did not volunteer to be exposed to the risk;
- Industrial risk causes more outrage than natural risk, and nuclear is pretty obviously industrial;
- Exotic risk causes more outrage than familiar risk, and most people are far less familiar with nuclear power than they are with oil and gas;
- Memorable risk causes more outrage than unmemorable risk, and nuclear incidents—Three Mile Island, Chernobyl—are extremely memorable;
- Dreaded risk causes more outrage than undreaded risk, and activist groups have been successful in creating and nurturing nuclear dread;
- Catastrophic risk causes more outrage than chronic risk, which may be the biggest challenge, since nuclear incidents tend to be catastrophic, while fossil fuels do their damage primarily by creating chronic illnesses and environmental problems;
- Risk controlled by others causes more outrage than risk controlled by individuals (which is why people fear air travel more than car travel), and nuclear power is completely beyond an individual’s control.
A longer and very interesting analysis of the Fukushima situation by Mr. Sandman himself can be found on his website. Mr. Sandman’s conclusion with regard to crisis communication is: always err on the alarming side until you are absolutely, 100% certain the situation cannot get any worse. Otherwise you risk your credibility, and with it your ability to respond. Communications professionals would also do well to read security expert Bruce Schneier’s writings about how people evaluate and respond to risks, which dovetail with many of Mr. Sandman’s points. I recommend Schneier’s book, Beyond Fear. It’s a few years old, but still relevant.
In TEPCO’s defense, the playbook for nuclear catastrophe communication in the midst of a general natural disaster is pretty thin. There are exactly zero precedents. Even absent the natural disaster component, there have been only three previous reactor accidents of any scope: Windscale, Three Mile Island and Chernobyl. Only Three Mile Island is really analogous to Fukushima, and the communication there was also blown, as Mr. Sandman points out. But it shouldn’t really matter whether there is a precedent or not. Nuclear power, which has always had a bigger outrage problem than a hazard problem, really deserves over-preparation.
In his book The Big Short, author Michael Lewis proposes that the people who made fortunes betting on the collapse of the American housing bubble shared two iconoclastic traits: the ability to envision the worst happening (the bubble collapsing) and the willingness to act on that vision in the face of social pressure. It struck me upon reading that book that being able to envision the worst and to prepare for it is a really valuable skill for communications professionals. To come back to Chicken Little, it’s not that a PR person should run around screaming that the sky is falling, but a PR person (and indeed any member of a company’s crisis management team) should be able to pose the question: what would we do if the sky fell?
Even if, and this is important, even if others in the organization suggest that such a thing is unthinkable. After all, just because something is unthinkable doesn’t mean it won’t happen.
In part two: What the hell is an “apocalypse box” anyway?
Note:
*If you haven’t seen the “Bert the Turtle” public service announcement on how to survive an atomic attack, it’s well worth watching. A surreal mix of 1950s cold war terror and sunny “you can survive” optimism with a Ward Cleaver voice-over that must be somewhere in Billy West’s stack of reference material. “It’s such a big explosion that it can…break windows all over town.” Indeed. It’s a YouTube link, so get your VPN ready. If it’s working.