CSO Disclosure Series | Reporter's Notebook: The United States of TMI

Lead paint in toys. Brain-eating amoeba. Identity theft. Drowning in sand. We know more than ever about the risks all around us. Do we know what disclosing them all is doing to us?


And we’re making it harder by disclosing more risks than ever to more people than ever. Not only does all of this disclosure make us feel helpless, it also feeds us ever more of the images and experiences that trigger the intuitive response, without the analytical rigor to override the fear. Slovic points to several recent cases where reason has lost to fear: the sniper who terrorized Washington, D.C.; pathogenic threats like MRSA and the brain-eating amoeba. Even the widely publicized drunk-driving death of a baseball player this year led to decisions that, from a risk perspective, were irrational.

THE BEST EXAMPLE of the intuitive brain fostering bad decision-making is terrorism, which produces the most existential nausea of all. On a group scale, it can be argued that decisions following 9/11 were poor, emotional and failed to address the risks at hand. Not only that, those decisions took necessary but limited resources away from other risks more likely to affect us than terrorism. Like hurricanes.

The effect is identical with individuals. Ask 100 people which is a bigger danger to them, getting five sunburns or getting attacked by terrorists, and many will cite the latter.

That’s intuitive. Terrorism is, well, terrifying. But it’s also exceedingly rare. In an excellent paper, University of Wisconsin Professor Emeritus Michael Rothschild dares to conjure the awful to make an important point. He shows that even if terrorists were able to hijack and destroy one plane per week, and you took one trip by plane per month over that same period, your odds of being affected by those terrorist attacks would still be minuscule: one in 135,000.

Even if that implausible scenario played out, you would still be about 4.5 times more likely to die from skin cancer next year (one in 30,000) and 900 times more likely to get skin cancer if you’ve had five sunburns in your life (one in 150).
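For the skeptical, those ratios follow directly from the odds cited above; a quick sketch, using only the figures the article gives:

```python
# Checking the odds comparison; all figures come from the article, not real actuarial data.
p_terror = 1 / 135_000          # killed in Rothschild's hijacking scenario
p_cancer_death = 1 / 30_000     # die of skin cancer next year
p_cancer_5burns = 1 / 150       # get skin cancer after five lifetime sunburns

print(p_cancer_death / p_terror)    # ~4.5: "about 4.5 times more likely"
print(p_cancer_5burns / p_terror)   # ~900: "900 times more likely"
```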

But that doesn’t matter. For sunburns, we have all kinds of ways to exert control and make ourselves feel less helpless: hats, SPF 50 lotion, perceived favorable genetic histories and self-delusion--"My sunburn isn’t as bad as that guy’s." We have volume knobs for that radio.

For terrorism, we have no volume knobs.

ONE WAY TO CREATE them is to kindle that analytical part of our brains--to provide as much factual context about the risk we’re disclosing as possible. The more we can provide, the more we can override our dread.

Unfortunately, the lack of control over who discloses what and how they do it has actually fostered less context and fewer useful statistics, not more. So toys in my house have lead in them. What does that mean? Moreover, what are the chances someone in my house will be affected by that lead, and how?

Without context, we are anxious. Risks appear both more random and more likely than they really are. We lack control. We curl up and whimper.

Why don’t we include context with our risk disclosure? For some, it would run counter to their agenda for disclosing the risk in the first place. “Brain-eating amoeba kills 6” gets me to click. “Rare amoeba with a one in 50 million chance of affecting you”--not so much.

Fischhoff believes that another factor at play is what he calls the myth of innumeracy. "There’s an idea that people can’t handle numbers, but actually there’s no evidence to support that," he says. "I believe there’s a market for better context with risks."

Slovic is less certain. He says numeracy varies widely, both because of education and predisposition. "Some people look at fractions and proportions and simply don’t draw meaning from them," he says. To that end, Slovic thinks we need to create images and experiences that help us emotionally understand the risk analysis, that make us feel just how rare certain risks are. Make the analytical seem intuitive.

It’s difficult to do this well. For example, Slovic remembers trying to explain chemical risks that occur in the parts-per-billion range. People can’t relate to what a part per billion means, “so we said one part per billion is like one crouton in a thousand-ton salad," he says. He thought people would get that. Unfortunately, the analogy backfired: it made people exaggerate the threat, not downplay it. Why? "Because I can imagine a crouton," says Slovic. "I can hold it in my hand, and eat it. I can’t really picture what a thousand-ton salad looks like. So I focus on the risk, the crouton, because I can’t compare it to a salad that I can’t even imagine."

Another common mistake when putting risk in context is to use the wrong numbers, specifically focusing on multipliers instead of base rates, says Fischhoff. For example, cases of the brain-eating amoeba killing people have tripled in the past year. Yikes! That’s scary, and good for a news story. But the base rate of brain-eating amoeba cases, even after it tripled, is six deaths: about one in 50 million people. That’s less scary, and also less interesting from the prurient newsman’s perspective.

But even that “one in 50 million” characterization is problematic. It still causes people to exaggerate the risk in their minds, a phenomenon called "imaging the numerator." In one experiment that showed the dramatic effect of imaging the numerator, Slovic notes, psychiatrists were given the responsibility of choosing whether or not to release a hypothetical patient with a violent history. Half the doctors were told the patient had a "20 percent chance" of being violent again. The other half were told the patient had a "one in five" chance of being violent again.

Startlingly, the doctors in the "one in five" group were far more likely not to release the patient. "They lined up five people in their minds and looked at one of them and saw a violent person." They imaged the numerator. On the other hand, 20 percent is an abstract statistic that hardly seems capable of violence.

It sounds illogical, but our minds think that "one in five" is riskier than "20 percent."

ESPECIALLY FOR PARENTS, one of the illusory controls we use to manage all this risk disclosure is to consult our own experiences, or lack of experiences. To wit: "I had lead toys as a kid and, well, I turned out okay." In fact this is precisely what I told myself when I scanned the beach for killer sand holes. Look, I had spent parts of every summer of my life--maybe a year total--at the beach, and, well, I turned out okay. In fact, I had never witnessed, never even heard of, the collapsing sand phenomenon.

But I knew after talking to Fischhoff and Slovic that this was mostly self-delusion, and not nearly enough for the analytical part of my brain to override the intuitive part. I’d need more to stop worrying about the beach eating my kids. So I decided to put math up against the myth--to come up with some rigorous analysis, and relatable images, that would finally scotch the dread.

The sand story quotes a Harvard doctor who witnessed a collapsing sand event and who has been making awareness of it his personal crusade ever since (the intuitive response to risk, like parents pulling their kids off the bus). The doctor even published a letter in the New England Journal of Medicine to raise awareness of “the safety risk associated with leisure activities in open-sand environments.” In the letter, he documents 31 sand suffocation deaths and another 21 incidents in which victims lived, occurring mostly over the past 10 years. He believes these numbers are low because he based them only on news reports that he could find.

Using rough math, let’s assume he’s right that the real numbers are higher, and double his figures: call it 60 deaths and another 40 cases where victims lived over the past decade. That’s 100 cases total, or about ten per year: six deaths and four incidents where the victim lived. Of course, not everyone on the planet goes to the beach, so let’s say beachgoers made up only one percent of the world population last year, about 60 million people. (It’s probably higher than this, but assuming fewer beachgoers to begin with actually makes the risk per person higher.) With this conservative estimate, there would be six deaths by sand for every 60 million beachgoers. One in 10 million beachgoers is likely to die in a sand hole.
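That back-of-the-envelope estimate can be laid out explicitly. Every number below is one of the article’s stated assumptions, not real data:

```python
# Rough sand-hole odds, doubling the doctor's documented counts as the article does.
deaths_per_decade = 60          # doubled from the 31 documented deaths
survivals_per_decade = 40       # doubled from the 21 documented survivals
cases_per_year = (deaths_per_decade + survivals_per_decade) / 10   # ~10 per year
deaths_per_year = deaths_per_decade / 10                           # ~6 per year

world_pop = 6_000_000_000       # world population, circa 2008
beachgoers = 0.01 * world_pop   # conservative guess: 1 percent go to the beach

odds = beachgoers / deaths_per_year    # beachgoers per sand-hole death
print(f"one in {odds:,.0f}")           # ~one in 10,000,000
```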

That seems unlikely, but how unlikely? I’m still imaging the numerator. I’m seeing that one kid die, that teenager playing football who disappeared before his friends’ eyes. I’m thinking of his mother, grieving, and saying desperately in the story, "People have no idea how dangerous this is."

I need something better, some kind of image. I found this oddly pretty graphic at the website for the National Safety Council (which hasn’t credited the designer):

Graphic from National Safety Council

Here, the size of each circle represents the relative likelihood of dying from the listed cause. One of every five deaths is from heart disease, so that circle is 20 percent of the overall circle (represented by the red arc), which stands for the chance of death from any cause: one in one. The stroke circle is about one-fifth the size of the heart-disease circle, since stroke accounts for about four percent of all deaths. The circles get smaller down through rarer and rarer causes of death. The smallest circle on this particular map is death from fireworks discharge, a fate suffered by one of every 350,000 people; it’s just a few pixels on the screen.

On the above graphic, the circle for suffocation by sand would be almost 30 times smaller than the fireworks circle. Invisible, which is fitting, since experts call such risks "vanishingly small." What’s more, it would be nearly 10,000 times smaller than the circle for "drowning." In other words, when I’m at the beach, I should be thousands of times more worried about the ocean than the sand--and even then, not too worried about either.
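The "almost 30 times smaller" figure follows from the two sets of odds, treating circle size as proportional to likelihood. The drowning odds below are my own rough assumption (the article implies roughly this order of magnitude; the graphic’s exact figure isn’t quoted):

```python
# How much smaller is the sand-hole circle? Odds as given in the article.
fireworks = 1 / 350_000        # smallest circle on the NSC graphic
sand_hole = 1 / 10_000_000     # the article's sand-suffocation estimate
drowning = 1 / 1_000           # ASSUMED rough order of magnitude, not from the graphic

print(fireworks / sand_hole)   # ~28.6: "almost 30 times smaller"
print(drowning / sand_hole)    # ~10,000: "nearly 10,000 times smaller"
```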

But I’m still not quite convinced. The problem with vanishingly small risks is they’re just that, hard to see, even in your mind. Once the circle disappears, it’s hard to understand how much smaller it’s getting.

So I need an image whose proportions I can relate to. How about an Olympic-size swimming pool? If all causes of death were represented by the 660,000 gallons of water in that pool, then heart disease would take up 132,000 gallons, enough to fill about 15 tanker trucks. Death by fireworks discharge would take up roughly two milk jugs of the pool’s water.

And of all that water in the swimming pool, dying in a sand hole would take up about 17 tablespoons, less than nine ounces.
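Those pool volumes check out; a quick sketch using the article’s figures (and the standard conversions of 256 tablespoons and 128 fluid ounces per US gallon):

```python
# The causes-of-death swimming pool, using the article's figures.
POOL_GALLONS = 660_000        # Olympic-size pool
TBSP_PER_GALLON = 256         # 1 US gallon = 256 tablespoons
OZ_PER_GALLON = 128           # 1 US gallon = 128 fluid ounces

heart_disease = POOL_GALLONS * (1 / 5)          # one in five deaths
fireworks = POOL_GALLONS * (1 / 350_000)        # ~1.9 gallons, about two milk jugs
sand_hole = POOL_GALLONS * (1 / 10_000_000)     # ~0.066 gallons

print(heart_disease)                            # ~132,000 gallons
print(round(sand_hole * TBSP_PER_GALLON, 1))    # ~16.9 tablespoons
print(round(sand_hole * OZ_PER_GALLON, 1))      # ~8.4 fluid ounces
```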

photo of kid on beach

I CAN’T CONTROL the rapid rise in risk disclosure, or the ubiquitous access to all of it. It’s probably not going to stop, and it’s unlikely that, in general, it will become more analytical. But now I know I can offset the cumulative effect it has on me, at least a little bit. I have that salve now. When dread creeps in, I can calm myself down. I can go to the beach without scanning for holes. I can let my kids play with their pails and shovels knowing they’ll be okay. Knowing that drowning in sand is, almost literally, nothing to worry about.

Copyright © 2008 IDG Communications, Inc.
