Paul Wing: Privacy's Northern Light

Paul Wing, former Scotiabank information security leader, says Canada and the United States need revamped privacy policies and practices

Paul Wing believes that with privacy, things have gotten out of hand, and he's working to restore order with a mix of ambition and idealism. One of Wing's current projects is to create real governance around privacy—how we authenticate, how data is stored and destroyed, and so forth. He's developed several privacy principles that he would apply to businesses and governments across the globe. He wants privacy to become an ethical cornerstone of doing business in a sustainable way, on a par with responsible environmental and child-labor policies. The kicker is that Wing believes that by doing this—by making privacy a moral imperative—we will improve the bottom line of both security and the business.

For two decades, Wing was head of Information Security at Scotiabank, where he implemented two-factor authentication as far back as 2000. He was Canada's privacy representative to the International Organization for Standardization (ISO) and the Organisation for Economic Co-operation and Development (OECD). He coauthored the book Protecting Your Money, Privacy and Identity from Theft, Loss and Misuse—Practical Steps for Today's World. When a Canadian magazine bought and published the Canadian privacy commissioner's personal phone records, the privacy commissioner called Wing, now an independent consultant, to seek advice and help deal with the problem. CSO senior editor Scott Berinato spoke to Wing at length about these and other privacy-related issues, including transborder data flows, his model for privacy governance and what's so interesting about his heat and hydro bill.

CSO: How is it possible for journalists to buy the Canadian privacy commissioner's personal phone records for $200?

Paul Wing: Amazing, right? It was just a magazine doing a proof of concept, but it produced a lot of angst up here. I've been doing a fair amount of work to get my head around what the issue is here, and I think it comes down to this: There's no common reference point for what is a best practice for authentication, for privacy, for managing risk.

How do you mean?

Here's an example. To get access to a copy of my heat and hydro bill, the company requires me to use a seven-character, case-sensitive alphanumeric password with a minimum of one alpha, one numeric and one capital. That's just to look at my statement. That's stronger authentication than some banks require for online banking. At ATMs we use four-digit PINs, technology from the '70s. I looked around my house and the only other technology I have from the '70s is my record player for my LPs and the light switch. How can a four-digit PIN, which, by the way, I'm not required to ever change, adequately protect banking? This is why "shoulder surfing" [reading someone's PIN as they punch it in] happens. And yet, on the other hand, I have to change my e-mail password every 30 days even if I don't give a damn about that. So there's no logic behind what kind of protections we put where.
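The utility's password rule Wing describes can be expressed as a short check. This is a hypothetical sketch for illustration only—the function name is invented, and it assumes "seven-character" means a minimum length of seven—but it makes the contrast with a four-digit PIN concrete:

```python
import re

def meets_utility_policy(password: str) -> bool:
    """Check the policy Wing describes: at least seven characters,
    alphanumeric only, with at least one letter, one digit and
    one capital letter (comparisons are case-sensitive)."""
    return (
        len(password) >= 7
        and password.isalnum()
        and re.search(r"[A-Za-z]", password) is not None
        and re.search(r"\d", password) is not None
        and re.search(r"[A-Z]", password) is not None
    )

# A compliant password passes; a bank-style four-digit PIN does not.
print(meets_utility_policy("Abc1234"))  # True
print(meets_utility_policy("1234"))     # False
```

Any rule this simple could just as easily protect an online banking login, which is Wing's point: the strength of the check bears no relation to the value of what it protects.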

This all seems to focus on security, specifically authentication. The lack of logical risk-based authentication policies makes private data more vulnerable by opening up avenues to identity theft.

Yes. Strong authentication can ensure that private information is more likely to remain private. No one's going to be able to get my heat and hydro bill statement, but why does that have stronger authentication than the cash machine? Because there are no best practices or uniform, risk-based policies. We're trying to develop those now.

So how we authenticate is one problem. Isn't what we use to authenticate another problem?

Absolutely. We're working on a governance model with several principles, one of which would define what is a valid piece of personal information to authenticate with. Our principle is, you should never use personal facts that don't change in your life as an authentication tool: Date of birth. Social Security number. Town you were born in. Mother's maiden name. There are also things that change very little—our mailing address, for example. Those should be avoided, too.

But wait. The uniqueness of these facts is what makes them appealing authentication techniques in the first place. Why wouldn't I want to use them?

Well, one reason is the people who know you can use this information, too—to your detriment.

One of the things we're seeing in Canada is children of the elderly using electronic banking to take over accounts and receive inheritances before their parents have died, before they're entitled. If my parents are in their 80s and have no idea about telephone and Internet banking, I can call and say I'm Mr. Wing Sr. and I'd like to activate my online banking. They'll say: Sure, what's your date of birth, mother's maiden name? I know all this stuff about my parents. There is a system-savvy, tech-savvy younger generation exploiting these authentication weaknesses.

Insiders will always have more information and more access though. How can you stop this?

What I push for is what you might call "opt out forever." There should be an opportunity for consumers to say, I never want to use a service, like online banking. And if "I" do use it, that is if someone activates it pretending to be me, you [the bank] are responsible. The fact that many services can be activated at any time is an issue. Two of the biggest sources of identity theft here in Canada, it turns out, are video stores and health clubs. You give them your driver's license and credit card and address and date of birth and your signature and all these minimum-wage employees have access to it behind the counter. If you stop using that store or club, do you think they destroy that information? No.

That's another principle that must be developed: destruction of credentials. Not only are they collecting information they don't need, but there's also no reason for many places to keep the information they keep.

What are some of the other principles you're advocating?

Choice of risk. Enterprises should give citizens choice of the strength of authentication they use. If I choose a four-digit PIN then that's my risk and my responsibility. If I want, I should be able to demand multifactor authentication. What we're not doing as enterprises consistently is giving consumers the choice of how they want to protect themselves. I don't know, but I suspect, given the money we spend on things like cell phones, that if you said for an extra dollar a week you can have better security through, say, a USB token, most people would say, Sure, that's worth it. I think consumers would pay for the cost of the extra stuff, for two- and three-factor authentication. When I buy a car I can decide the safety features I want. What do I want to pay for a higher crash rating?

Another thing I now encourage my lecture audiences to do is to tell white lies. For example, I will change my DOB for someone I deem not worthy of that information. I have an algorithm for my "second date of birth." Part of the ethic of the boomers and the elderly is that we weren't allowed to question authority. So part of what I deal with as a privacy advocate is to tell people it's OK to say, No, you don't need that information, and that it's OK to tell a fib and give a date of birth that's not quite the truth.

That's the second time we've talked about boomers and the elderly. Are there generational forces at play regarding privacy?

My sense here is that people enjoy the convenience of the Internet to do things, including the boomers and elderly, but they're giving up on some things. They're not doing as much online because of security and privacy, but they still dabble in it. Having said that, there will be a paradigm shift at some point, because the younger generation doesn't yet understand the privacy argument. I've seen statistics out of the University of Ottawa that a huge percentage of kids admitted to giving away personal information on the Internet. These are kids under 16. And, here's the scariest part: A large percentage admitted to giving away information about their parents.

Apple's iTunes support service asks for not just unchanging personal identifiers but also information seemingly unrelated to the request. In one instance I was asked for my iPod serial number to get support for the iTunes store. There seems to be a real lack of logic around what information companies ask for sometimes. Why is that?

Sometimes the system is designed by a techie copying how someone else before them did it. Again, we have no best practices or standards. Other times it goes through marketing, and they want that personal information. Then there are customer service people who are not allowed discretion when they're authenticating you. They just have to go through their script. They have to insist on my date of birth even if it's irrelevant. And it's almost always irrelevant. The reality is we're not seeing governance to control that because the role of the chief privacy officer is not well established in most organizations.

Most CPOs are complaint investigators, as opposed to being involved in the governance policies around the capture and destruction of data.

What are the chances, though, that those marketing departments are going to voluntarily stop asking for all this minable data if customers don't raise a fuss?

I agree the marketing machine has the power right now. But when people do get burned, then they start to become believers. And right now, lots of people are getting burned.

Like the privacy commissioner in Canada.

Besides the fact that Canada has a privacy commissioner and a privacy office while the U.S. does not, talk about some of the other differences in terms of privacy.

North of the border we treat ID theft differently. We don't treat credit card theft as ID theft. If someone masquerades as me to apply for credit in my name, that's ID theft. I was in Dallas recently and they were talking about an ID theft scam, and eventually I figured out it was just fraudulent checks. So that's confusing sometimes. We have two privacy laws here. PIPEDA [the Personal Information Protection and Electronic Documents Act, pronounced PIP-eh-dah], which governs citizens' privacy rights, and the Privacy Act, which focuses on government agencies' responsibilities around privacy. I would say Canada is certainly looking more progressive right now than the U.S. Transborder data flows have become a big issue here.

Transborder data flows?

Data that crosses from Canada into the United States for storage or processing. We get questions now: Does my data go across the border to get processed? The answer is usually no, but sometimes it does in, say, a disaster recovery scenario. Companies trying to comply with PIPEDA were not comfortable with that because of the USA Patriot Act. They couldn't for sure guarantee compliance with PIPEDA if data were in the U.S. and the Patriot Act allowed access to that data without informing anyone. There's a lot of tension there. So service providers are realizing this cross-border thing is an issue to the point that they're looking at moving backup and disaster recovery back to Canada or Europe. Anywhere but the United States.

With an issue like privacy, many people often try to take a pragmatic approach to addressing the issue—trying to figure out, say, a return on investment for being a good corporate citizen with privacy. You seem to think more big-picture than practical.

Well, I believe that, in fact, doing right by privacy does eventually contribute to the bottom line and make you a better company; I also think it's bigger than that. I'm a member of the Association of Chartered Certified Accountants. They have something called the Sustainability Reporting Awards. They recognize companies for reporting and disclosure and for accountability. A lot of sustainability reporting is industrial and environmental in nature. You see some around child labor laws, but you don't see much around social-conscience issues like privacy rights. I've been working with these groups trying to figure out how we get privacy built into sustainability reporting. It's going to be a big differentiator.

Copyright © 2006 IDG Communications, Inc.
