Memorability: Familiarity with the Bad Outcome

The bottom line so far: Familiarity with a risk decreases our outrage, which decreases our risk perception, which decreases our precaution-taking. Unfamiliarity, on the other hand, increases outrage, risk perception, and precaution-taking.

Familiarity with the bad outcome is a different kettle of fish altogether. It increases outrage – concern, anxiety, fear – and therefore increases caution. This is why seriously injured employees make good safety spokespeople. “Look at me! That’s the result of a moment of carelessness. It happened to me. It can happen to you.” This is also why high school driver’s ed courses show lots of videos of car crashes, or even take students on a field trip to the junkyard to see firsthand what carelessness can do to a car and its occupants.

Because familiarity with the bad outcome works in exactly the opposite way from familiarity with the overall situation or its risk-related aspects, I have usually labeled it “memorability” instead of familiarity. If you’ve seen things go badly wrong, then it’s memorable for you that things really can go badly wrong. Increased memorability leads to increased outrage and therefore to increased precaution-taking. And decreased memorability leads to decreased outrage and therefore to decreased precaution-taking.

What I am calling “memorability” or “familiarity with the bad outcome” is a special case of “the availability heuristic” – a label coined by Amos Tversky and Daniel Kahneman in 1973. Tversky and Kahneman summarized the availability heuristic as an unconscious belief that “if you can think of it, it must be important.” In the sorts of situations we’re talking about here, that translates to: “If you can easily imagine or remember awful things happening, you tend to think they’re likelier to happen.”

Memorability is thus the flip side of familiarity. Familiarity is the extent to which you have lived with the risky situation without anything going wrong. Memorability is how easy it is for you to remember or imagine something going wrong. Memorable risks are the ones that linger in our minds.

Personal experience is of course a powerful source of memorability. But so is media content – not just news, but also fiction. What makes genetically modified organisms (GMOs) memorable, for example? It’s mostly science fiction, especially science fiction movies. Virtually everyone has learned that if there are scientists splicing genes early on in the movie, there will be monsters rampaging by the halfway point.

Signals and symbols are also important sources of memorability. Odor, for example, is a very unreliable indicator of risk; toxic chemicals may not smell, and smelly chemicals may not be toxic. But odors are exceptionally memorable signals – both pleasant odors and unpleasant ones. And human beings are hard-wired to respond with alarm to an unpleasant odor.

Or consider this example of a memorable symbol. As part of the cleanup of a former industrial site, the site owner removed contaminated soil from neighbors’ yards. All the soil was consolidated on the site itself, to await final disposal when a remedy was chosen. Waste cleanups being what they are, more than a decade passed without a final remedy. And the 30-foot-high mini-mountain of contaminated soil, covered with a black plastic tarpaulin, became a vivid ongoing symbol of the situation – visible from neighbors’ windows, always in their face as they went about their business. Most neighbors came to understand that even though the “high-haz pile” was an eyesore, in hazard terms it was just the tip of the iceberg. And they understood that any remedy would leave the site flat; the pile would be gone. Even so, as the debate over capping versus treatment versus offsite disposal dragged on, hauling away the high-haz pile remained important to many, even among those who were reluctantly willing to see the rest of the contamination capped on site. It made no technical, regulatory, or economic sense, but it made perfect symbolic sense.

Shrewd activists choose their fights in part for symbolic potential. A wonderful example is the 1989 controversy in the U.S. over Alar, an additive that helped apples stay on the tree longer, until the Natural Resources Defense Council led a battle to get it banned – in the process persuading Congress to stiffen regulation of pesticides and agricultural additives generally. A lot of the memorability of Alar came from symbolism: the apple as a symbol of innocence and the poisoned apple as a symbol of betrayed innocence. From Adam and Eve to Snow White, we have vivid images of poisoned apples. Cartoonists used the poisoned apple imagery to symbolize the Alar controversy. More importantly, the poisoned apple imagery resonated in the minds of audience members, greatly amplifying their sense of the seriousness of the Alar threat. Imagine that Alar, instead of being an additive on apples, were something you put in pork. I think the NRDC is too smart to have gone after an additive in pork, but suppose it had. The media and the public would not have been especially interested. Why? Nobody has any vision of pork getting any dirtier than it already is. But apples!

High memorability is particularly powerful when it is paired with low familiarity: “I don’t understand what you do, but I remember the night it all went haywire!” Memorability also feeds on itself, often by way of media coverage. A memorable event gets a lot of coverage and arouses a lot of public outrage. Increased outrage justifies more media coverage; that makes the risk more memorable; that leads to more outrage, more coverage, more memorability, more outrage, and on and on in an upward spiral.

If you’re trying to arouse or increase people’s outrage, memorability is your ally, and you should do everything you can to keep reminding your audience of the things – the events, news stories, movies, odors, whatever – that make the risk memorable.

What if you’re trying to reduce people’s outrage? Only time will make that event/story/movie/odor less memorable. All you can do to help move the process along is to keep acknowledging it. It is natural to prefer not to talk about it, of course. But your silence just makes it all the more vivid in everyone else’s mind, and all the more powerful as ammunition for your opponents. Nor is it sufficient for you to talk about it once or twice. That might be enough for an audience that’s barely aware of it in the first place: Get it on the table so no one can accuse you of hiding it, then move on to something more positive. This is the conventional advice of public relations professionals, and it makes sense for an inattentive mass audience. But not for stakeholders who see the situation in a whole new light because of this memorable event/story/movie/odor. Once or twice won’t do the job. You have to wallow in it, continuing to talk about it not until you’re sick of it (you were sick of it from the outset) but until your stakeholders are sick of hearing about it.

There is a chemical plant in Canada that had an accident many years ago, the plume of which came to be known in the media as “the Blob.” Journalists soon got in the habit of referring to the Blob any time that plant’s environmental problems made the news. The plant manager decided to put a stop to what he considered unfair harping on an ancient incident, so he put the word out to all employees to avoid the subject: It will cost you your job to mention the Blob. Inevitably, the strategy backfired. The spokespeople’s unwillingness to talk about the Blob made reporters all the more interested in asking about it, giving the original accident another decade or so of life. Instead of trying to ignore the Blob, plant representatives should have been talking it to death – comparing every current emission to the Blob, endlessly discussing what the company had learned from the Blob and how committed it was to never having another.

The basic relationship between memorability and precaution-taking is as I have described: The more memorable the risk is, the more outrage people are likely to feel – and, therefore, the likelier they are to take precautions or demand that others take precautions on their behalf. If you want high outrage, you want high memorability; if you want low outrage, you want low memorability.

But like the relationship between familiarity and precaution-taking, the relationship between memorability and precaution-taking can sometimes get complicated.

For one thing, outrage can be too much of a good thing, from the perspective of precaution-taking. People who get too scared may go into denial or paralysis – like a child so frightened of tooth decay that he avoids brushing his teeth in order not to have to think about it, or a woman so frightened of breast cancer that she avoids checking for lumps for the same reason. It’s hard to provoke that much fear; in precaution advocacy, as a rule, the more outrage the better. But not always.

Also, people can get desensitized to memorably frightening stimuli – and then it takes scarier and scarier stimuli to keep them in a cautious frame of mind.

But the most important complication is this: Familiarity with the bad outcome has to feel genuinely bad in order to provoke more outrage and more precaution-taking. Remember that nobody died at Three Mile Island. For most people, nonetheless, the lesson of TMI was that many, many things went wrong there, proving that nuclear power isn’t safe. But for nuclear power supporters, the lesson of TMI was that U.S. nuclear plants are so safe that even when many, many things go wrong, nobody dies.

The lesson of a near-miss is always ambiguous. Does it mean that “defense in depth” is working and a serious accident is unlikely? Or does it mean that we came much too close to disaster and need to beef up our precautions?

Research dating back to the 1980s has consistently found that minor accidents and near-misses have a bigger “signal effect” when the relevant risk is unfamiliar (especially if it’s also highly dreaded). If something goes wrong with an unfamiliar risk, we’re likely to take it to heart, to see it as vivid evidence that the risk is unacceptable. If something goes wrong with a familiar risk, on the other hand, we tend to shrug it off, seeing it as an exception rather than as a warning. So a near-miss in a factory may terrify neighbors while leaving employees blithely unworried.

This may help explain why corporate communicators routinely play down the significance of a near-miss in what they say to the media and the neighborhood (“no one was ever in danger”), while the company’s safety people play up its significance in what they say to employees (“don’t ever get this close again!”).

Best practice for near-miss risk communication is to offer both interpretations to all audiences: “Any violation of our safety protocols is a reason to worry. Yes, the backup systems worked the way they were designed to work, so nothing bad happened. But we got closer to a serious accident than we want to get. Relying too much on backup systems is a recipe for disaster.”

The lesson of living through a dangerous situation – a hurricane, for example – is similarly ambiguous. Most veterans of major hurricanes take hurricanes seriously; experienced Floridians routinely board up their windows and evacuate their homes, sometimes several times in the same season. But some veterans of serious hurricanes become overconfident instead: “I lived through Andrew and Wilma and I’ll live through this one too.” As Superstorm Sandy approached New York City in late October 2012, too many residents of low-lying, flood-prone parts of the city refused to obey evacuation orders, pointing out that they had survived Irene a year earlier without difficulty.

So it’s important to tell people who evacuated and whose homes were unscathed that they were nonetheless smart to get out, and that they should evacuate again the next time there’s a hurricane headed their way. And it’s important to tell people who didn’t evacuate and whose homes were unscathed that they got lucky this time but shouldn’t push their luck – just listen to what happened to so-and-so who got cocky after Irene and nearly died in Sandy.

Familiar Turned Strange

The familiar-turned-strange is the stuff of nightmares and horror films. But it often fuels risk controversies as well. This is, in fact, the worst sort of unfamiliarity: You thought you were familiar with the situation, when suddenly it went weird on you. Although familiarity does help keep outrage low, there is always the possibility of this sort of boomerang effect: If the outrage manages to crash through the familiarity bulwark (to mix the metaphor a little), it is all the worse for having corrupted what had felt safe.

The example I gave earlier in this column: People resist worrying about risks in the most familiar venue of all, their own homes. That’s why it’s hard to arouse sufficient outrage about radon, carbon monoxide, home fires, home accidents, etc. But suppose you learn that your home has been contaminated by dangerous emissions from a nearby factory, or by toxic chemicals used by an unscrupulous or incompetent contractor you brought in to do some repair work. Now the fact that it’s your home – your home! – makes the outrage all the worse.

People whose homes have been burglarized often end up selling and moving elsewhere. Especially if they really loved their house or apartment, the comfortable feeling it gave them may not be retrievable. (Some people respond the same way to illness: “My body has betrayed me.”)

Similarly, we have trouble believing a familiar and much-loved consumer product could be dangerous – but once we decide it really is, we skip right from underreacting to overreacting. “My God,” an Australian colleague joked, “even Vegemite!” And employees who get used to the manufacturing processes they work with tend to see any change as a threat. The new solvent may actually be safer than the old one, but it smells different – and unless it’s introduced with a thorough familiarity-building campaign, it will therefore generate more outrage. These are all everyday examples of the “familiarity betrayed” boomerang effect.

Or think about technologies whose presence is highly familiar but whose actual workings we find totally mysterious. For me it’s my computer. I use it constantly and depend on it totally. But if it malfunctions I am totally at a loss. Brought face-to-face with my own ignorance, dependence, and vulnerability, I feel betrayed and incredibly anxious. Watching a hardware expert muck around in my computer’s innards is like undergoing surgery without anesthesia.

Copyright © 2012 by Peter M. Sandman