Missing the real opportunity of Snowden and Manning

If we fail to recognize the underlying point of these and similar actions, we miss the necessary conversation we need to have as an industry and within our organizations.

The actions of Snowden and Manning have dominated the headlines. Politics aside, the challenge centers on trust for those with access to sensitive information and the larger threat of rogue employees.

The common reaction is an almost knee-jerk demand for more, stricter controls. In most cases, these controls won't work.

A few years ago I conducted a training session at a military base (going to keep this a bit generic). It was an intensive five-day training, and I offered the students a bundle of electronic supplemental materials.

At the time, providing a link to a website was not feasible. Instead, the quicker and more reliable method was "sneaker net 2.0," a USB drive.

When I held up the slightly-larger-than-my-thumb drive -- a whopping 16MB -- and asked who wanted it, I was prepared for a jeer about whatever virus I'd put on it.

No jeer. Instead, I watched faces wrinkle a bit, staring at the plastic object in my hand. Finally, someone spoke up: "We can't put that in our computers."

"Wow, you guys take security seriously!"

"Oh no, it's not that. We just installed a new security program, and if we insert the drive, it will be erased, formatted, and then encrypted."

"Okay. Well, thoughts on how I can get this information to you?"

As if I had issued a command, at least five people ducked below their desks at the same time. Puzzled, I just watched. A few seconds later, each popped back up, a shiny new CDR in their outstretched hands.

"Use this."

Wait. What?

The program painstakingly implemented a few weeks before the training class protected only against USB drives. As a result, the new version of sneaker net was the CDR (and CDRW).

A perfect "teachable moment" -- I had to ask how this happened. Turns out a private -- in a non-technical, non-security function -- figured it out within a few hours of the install.

Most of the base knew within the day. In this case, the private taught the security team. They were grateful.

Lots of lessons here. Some good. The basic point: more controls only create more opportunity, and more incentive to work around them.

Back to Snowden, Manning, and other rogue employees. Actually, forget the disgruntled, rogue employees. Over the years, an overwhelming majority of people have freely admitted they have taken or will take information when they leave their jobs -- regardless of what company policy says.

More controls just present new obstacles and create new opportunities. They are not likely to stop a determined insider -- regardless of intention.

The conversation we need to have

Someone knew. Someone saw. Probably lots of people. At some point, the situation didn't seem or feel right.

Yet no one acted.

Why?

That's the conversation we need to have.

Especially because as we move forward, we need people. We need human intelligence making -- and communicating -- good decisions.

In nearly two decades of working at the intersection of people, technology/security, and business, I've found this breakdown in decision making and communication happens frequently. 

It's a bigger problem than rogue insiders -- but the recent press on these two individuals presents the perfect opportunity to frame the conversation. Here are some ideas on what we should discuss:

Why didn't they do anything?

No easy, single answer.

The pattern I continue to observe is a disconnection between actions and impacts. I describe this as the Human Paradox Gap: individuals lose sight of the value of their actions in a personal and organizational context.

Bluntly, it wasn't their problem. At least, they didn't realize why it was their problem.

The challenge is deeper: put yourself in their shoes

You observe something with a coworker. Something about it just isn't right. It's not clear, though. Could be nothing.

Maybe you think about it on the commute home. Tough call.

If you felt inclined to take action, who would you tell? Did you just answer as a security professional?

Change the lens.

Put yourself in the situation where you have no security experience, except for annual "training" and the spate of "security" information telling you what you're doing wrong.

Now who do you tell? Who do you trust to tell?

And when you decide to tell someone, what do you tell them? What is the process? Specifically, what happens next?

Most importantly, what happens to the person who steps forward with a concern? Are they taken seriously? Or are they ridiculed, mocked, and told to mind their own business?

What do we do?

If we want people engaged in the process, we need to let them take responsibility. We need to develop transparent systems and demystify what happens.

It's time to reinforce that if something doesn't "feel right," stepping forward is not only okay, it's rewarded.

Clearly it's a balancing act (as suggested when one company reported a former employee for searching for pressure-cooker bombs).

That's why we have to reconsider communication and work to improve the entire system across the organization (and the industry). It'll take time, and a few mistakes, to get right. That means we need to quickly recognize mistakes, recover, learn, and continue.

The first step

As a starting point, this is a conversation we need to have in the industry: in the comments here, on Twitter, and at the many meetings, chapters, and conferences around the world. I'd like to be involved in as many as possible. I have time on my calendar. This is important.

Take the same conversations into the organization, across the different groups, and without presenting answers. Just ask the questions to start the dialog and let others guide the process.

It's a simple, powerful way to allow people to realize the impacts of actions, build some trust with the security team, and start to take back responsibility. It's a great way for us to learn what the real situation is, and how we can better support the business.

What are we waiting for?

Copyright © 2013 IDG Communications, Inc.
