
by Jen McCarthy

How What You Know Can Hurt You

Feb 06, 2006 | 10 mins
CSO and CISO | Data and Information Security

This is the true story of how a multinational company successfully avoided a business crisis. The moral of the story applies as well to government as it does to industry. We’ll get to the story in a moment.

Even though a “crisis” may seem to begin at a point in time—the hurricane makes landfall, the business faces bankruptcy—it may begin in decisions made, or not made, long before the crisis event. The decisions-made to strengthen building codes in Florida and California mean that many hurricanes and earthquakes aren’t crises there, even though similar hurricanes or earthquakes would be crises elsewhere. The decision-not-made to improve New Orleans’ levees (despite the frightening outcome of the simulated Hurricane Pam) made the real Hurricane Katrina much worse than it had to be. In other words, it takes more than brutal winds and rain to make a hurricane a crisis. It takes lack of preparedness, too.

The same holds true in business. The pirate attack last fall on a Seabourn luxury cruiser off the coast of Somalia was an emergency; it wasn’t a crisis, though, because the captain, crew and vessel were prepared (due to decisions-made), and fended off the attack. The theft of confidential consumer records from ChoicePoint Systems didn’t become a crisis because thieves tried to get into the company’s computers. It became a crisis because they succeeded in getting data out.

An event that threatens a business’s survival is a crisis. Such events include deliberate attacks (product poisonings, cyberterrorism) and consequences of impersonal actions (changes in government regulation, trade wars). Sometimes the arrival of a new competitor is a crisis, sometimes not. The Grand Opening Sale at Steve’s Stove Store won’t cause sleepless nights at General Electric or The Home Depot. The entry of Microsoft’s Internet Explorer (at the attractive price of free) was definitely a crisis for Netscape, which succumbed as surely as if someone had spread a virus that crippled its software.

Not every crisis arrives with a neon light announcing “Crisis! Crisis!” Sometimes, like a cancer, it grows slowly. Did Sears pay attention when Sam Walton opened his first store? Did General Motors, Ford or Chrysler worry when Japanese econoboxes came to America in the 1970s? Based on results, apparently not. Today Wal-Mart is seven times the size of Sears, and GM, Ford and Chrysler are desperately fighting for their lives.

We have time to respond, we tell ourselves, and we don’t feel the risk or urgency because today is only a little bit worse than yesterday. (Maybe it doesn’t feel worse at all, if there’s no hurricane or pirate.) Nonetheless, these decisions-not-made contribute to crises. They can—and do—destroy the largest, most-powerful companies and venerable, beloved cities.

One reason why we fall prey to these common crises: We think we are prepared and we think we know what to do, and we are wrong. What we know ain’t so.

The story

Management at a large, multinational, household-name company (whose name I will not reveal for competitive reasons) was enjoying great success with a household product when it learned a powerful foreign competitor was eyeing its market. This was not a Steve’s Stove Store scenario. The competitor clearly had the reputation, resources, skill and motivation to win.

A thoughtful senior executive decided to conduct a crisis simulation—a business war game—to test and strengthen his defenses before the competitor made landfall. My colleagues and I conducted that simulation.

In the simulation, the company’s management team took the new competitor seriously. They did their best to make tradeoffs between sticking to their budgets and reinforcing their levees against the new threat. Meanwhile, a group of the company’s managers role-played the new competitor, planning to capture the market.

The group role-playing the competitor succeeded, handily elbowing the company aside. Moreover, the company realized it would be especially tough to dislodge the competitor once it had established its position and begun to enjoy the benefits of incumbency. It would be like trying to undo a flood.

One happy capability of a computer-based simulation is that you can turn back the clock and see what would happen if you make different choices. That’s what we did in this business crisis. The home team, sobered by the outcome of the first simulation, took a close look at their competitor’s strategy. (That is, the strategy that their own colleagues devised for the competitor. Managers from the real competitor’s company didn’t participate in the simulation, of course.) Because they saw and believed the potential disaster, they thought hard and created a way to preempt the competitor. This time, the company’s proactive response to the imminent crisis held back the competitor.
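The "turn back the clock" capability described above rests on a simple design property: if a simulation is deterministic and can be re-run from the same starting state, you can replay it with different decisions and compare outcomes side by side. Here is a deliberately tiny sketch of that idea; the `Market` class, the share numbers and the effect of a preemptive move are all invented for illustration, not taken from the actual war game.

```python
# Toy sketch: a deterministic market simulation that can be replayed
# from the same starting state with different decisions.
# All names and numbers are illustrative assumptions.
from dataclasses import dataclass

@dataclass(frozen=True)
class Market:
    incumbent_share: float  # fraction of the market held by the home team
    entrant_share: float    # fraction captured by the new competitor

def simulate(start: Market, preempt: bool, rounds: int = 4) -> Market:
    """Advance the market a few rounds; a preemptive move slows the entrant."""
    state = start
    for _ in range(rounds):
        # The entrant captures less share each round if the home team preempted.
        shift = 0.05 if preempt else 0.15
        taken = min(state.incumbent_share, shift)
        state = Market(state.incumbent_share - taken,
                       state.entrant_share + taken)
    return state

baseline = Market(incumbent_share=1.0, entrant_share=0.0)
first_run = simulate(baseline, preempt=False)  # the sobering first game
second_run = simulate(baseline, preempt=True)  # clock turned back, new choice
print(first_run.incumbent_share, second_run.incumbent_share)
```

Because `simulate` has no hidden state or randomness, re-running it from `baseline` with `preempt=True` is the replay: same starting conditions, different decision, directly comparable result.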

The management team was so convinced by what they learned in the crisis simulation—convinced of both the threat’s severity and the antidote’s effectiveness—that they implemented their preemptive offense in a matter of weeks. The competitor’s attack never came.

Six months later, the company happened to hire a manager from the competitor’s company. They found out that the competitor had indeed planned to take the market, and with a strategy remarkably similar to the one the role-playing managers had anticipated. However, when the competitor saw the company’s preemptive move, it decided not to attack.

Were it not for the senior executive who saw the brewing crisis, who understood the impact of decisions-not-made and who brought in the simulation process, the attack would have come and probably succeeded. By preparing for the crisis, the company learned how to prevent the crisis. They protected their prosperity and careers, and they protected numerous jobs in their community.

Think about the company before the business crisis simulation. Most people would say it had every right to think it knew what to do. Its people had plans, they had budgets, they had forecasts, they had market research. They had talent, they had experience, they had brains, they had motivation. So do the companies that fail when competitors attack. What made this company succeed is that it had one uncommon benefit: a senior leader who questioned what he and his colleagues knew.

Leadership and drills

Note that preventing the crisis was about leadership and preparedness, not about data and drills. Getting more-accurate data about the competitor’s date of entry wouldn’t address the company’s real vulnerability any more than getting a more precise wind-speed reading would save a city from a hurricane. Only a senior leader could have the perspective and authority to identify the problem and implement a solution. In fact, that’s exactly what a senior leader is supposed to do.

Business executives and government officials don’t invest time or money in data, training or preparedness that they think they don’t need. Why would they? Spending time or money to get an answer they already know would be irresponsible and wasteful. However, if what they know ain’t so, then they (and the rest of us who depend on them) are in trouble.

Unfortunately, often what we know ain’t so. Over and over again, my colleagues and I have watched smart, dedicated managers get stunning insights when they see a problem in a new way in a crisis simulation.

The same is true in the public sector. We know that police and firefighters can talk on the radio. In some communities, that ain’t so because they use different radio frequencies. We know that hospitals can treat victims of a pandemic or terrorist bomb. But that ain’t so if fuel or roads are cut off, because hospitals need a constant stream of supplies. We know our emergency plans will work. As we’ve seen, tragically often, that ain’t so.

The moral of the story

We have two options to discover what we know that ain’t so. One is to wait for a crisis and see what’s so and what ain’t. That’s the very expensive way. The other is to proactively uncover it before a crisis hits. In other words, preparedness.

Whether you are in the private or the public sector, you can proactively uncover what’s so and what ain’t.

  • Ask questions that are different from those you might usually ask. For instance, in addition to asking whether a plan or function is ready to go, ask what would cause that plan or function to fail. Don’t go lightweight here.
  • Recognize that it is difficult to see intellectually or in the abstract what’s missing or what might not work. (If it were easy, plans would always work.) Exercises that explicitly and viscerally connect the dots, such as crisis simulations, are much more effective.
  • Work with senior leaders, not just front-line employees or emergency responders. Their decisions determine how the crisis will be tackled and even whether certain crises will hit.
  • Think less about precision and more about decisions. It almost never happens that the right decision depends on improving accuracy by another decimal place.
  • When a crisis hits, inexperienced executives or officials will feel a great deal of emotion, which can interfere with decision-making. One of the benefits of learning through simulation is that you process the emotions without placing real lives or real livelihoods at stake.
  • You have more options the earlier you begin, so think preparedness rather than management. You can preempt a competitor only if it hasn’t already committed to enter the market; you can vaccinate your computers or people only if they haven’t been infected yet; you can benefit from better levees only if you better them before the hurricane. Once things start getting really bad, you have fewer options, you have less time to make them work and you have to bet they’ll work the first time.
  • Learn expert decision-making skills: how to frame problems, how to avoid bias, how to build commitment. People aren’t born with those skills. Bring in decision-making experts to help, just as you would bring in experts to improve security, inventory management or public communications.
  • View crisis preparedness as a responsible investment. You can improve your company’s or community’s ability to deal with specific crises, and with crises in general, with programs and practice.

The moral of the story is to know there’s much we don’t know. We best protect our businesses and our communities by working through what could go wrong, rather than assuring ourselves everything will go right.

(We’ll conclude with a bit of trivia. Who said, “The trouble with people is not that they don’t know but that they know so much that ain’t so”? Like many people, I knew Will Rogers said it. I checked. I found that line (with minor variations) attributed to Josh Billings and Mark Twain as well as Will Rogers. Billings’ writings predate the others’.)