Jeffrey Rosen and the Naked Crowd: Liberty and Surveillance for All

Law professor and author Jeffrey Rosen argues that, in theory, security can be done in ways that won't scuttle privacy and civil liberties. But that's a tall order, and he acknowledges that it may not play out that way in practice.


I guess it's an example of how a politician, swayed by public opinion, may steer us toward choices that threaten privacy without increasing security, and of how important it is to be a little more heedless of public opinion if you're going to make a more thoughtful choice.

We've seen some vigilance among citizens (donning Groucho Marx masks in the face of surveillance, for example), but overall there's quite a bit of apathy toward something like the Patriot Act. Can you reconcile this?

As in all aspects of public policy, Americans tend to be self-regarding. They become concerned when they imagine something could affect them in a personal way, and they're less concerned by abstract dangers down the road. In order to understand the dangers of the Patriot Act, you have to be able to talk with some technical specificity about why surveillance [technologies] that are acceptable to catch terrorists might be dangerous if they're used to catch lower-level criminals. That's not an intuitively obvious point.

By contrast, the Super Bowl is something that everyone gets, and if you see the cameras, you can feel immediately affronted and personally insulted. So it's an act of personal defiance, and that's why they put on the masks.

What of the common argument, among those who seem apathetic to measures like the Patriot Act, that "I've done nothing wrong, so why should I care?"

This is the crucial sentiment that privacy people have to confront: You've got nothing to hide, so you've got nothing to fear. Again, it requires asking people to think a few steps down the road. In Tampa, Fla. [where police tried out a face-recognition system in the Ybor City entertainment district], there was supposed to be a database of suspected terrorists to be used for face matching. But the faces in the database weren't suspected terrorists; instead they were lower-level criminals guilty of small forms of wrongdoing. The database wasn't regulated. It turns out there were no matches made, because the technology is so ineffective that it had trouble distinguishing between men and women. Then there's the danger of the slippery slope: If we allow cameras, which are ineffective but pervasive, at the Super Bowl, wouldn't they spread to city centers? Ybor City found the cameras to be ineffective and dismantled them.

The other side, of course, is "Why are you so hysterical about these cameras? They'll get better. So what if they don't catch terrorists; they make us feel better. Why should innocent people mind?" One of the arguments I'm trying to make in the book is that there is a danger to feel-good technologies. They can distract us from responses to terrorism that really work. They give us an illusory sense of security, and they can indeed create an infrastructure of surveillance that, without too much imagination, could be linked in the future in ways that are hard to reconcile with American values.

In your book you cite El Al, the Israeli airline, as a case of security effectiveness that has little to do with technology.

I was moved by the El Al example. If you were looking for a single airline that has managed to conquer terrorism, El Al is where you'd look, since they haven't had a [significant] attack since the 1970s. And I was struck by the fact that they don't rely on technology or data mining; instead they engage in exhaustive training of agents with military precision. And these agents are trained to engage in extensive interrogations of passengers that can last up to 45 minutes. They don't rely on [database] profiles, but they make on-the-spot judgments. In short, they put most of their faith in human intelligence.

But there's a powerful American urge for a technological approach, because we're enamored of technology and because it seems so democratic: It surveys everyone without discrimination. Human intelligence requires trust in authority, in human discretion. These are all things that a culture suspicious of authority is reluctant to [credit]. Cultures with more of a tradition of deferring to government, like Israel or even Britain, may be better equipped to embrace discretionary human judgments that can actually catch terrorists.

We've talked mostly about the government side of security versus liberty. What about the private sector's role?

It's an important question, especially since nearly all of the technology and surveillance being applied in the national security sphere to catch terrorists was developed in the private sphere to classify and predict the behavior of customers on Amazon.

The whole point of consumer surveillance technology is to put people in different boxes based on their economic behavior. So if you're a good customer on Amazon, you get good customer service, you get in short lines and you [enjoy] preferred status because of your perceived value to the company. By contrast, the American government isn't supposed to put people in different boxes based on their perceived value to the government. In the eyes of the government, everyone is supposed to be equal, and the government is supposed to earn the trust of citizens, not the other way around. The predictive software that was developed in the private sphere threatens ideals of equality when it's applied blindly in the public sphere. You could well imagine constitutional values being threatened when the only people who get to go through the fast line are the ones who are rich.

Is the privacy debate just promarket Republicans versus antimarket Democrats?

Privacy is one of the few genuinely cross-cutting ideological issues. And what divides the privacy team from the progovernment team isn't political affiliation but trust in government. Two of the heroes of my chapter about law are Dick Armey and Bob Barr, the former House majority leader and the former Clinton impeachment manager, who are more responsible than anyone else in Congress for the elimination of some of the Patriot Act's excesses. Similarly, the biggest defenders of expanded government power since 9/11 have included not only big-government Democrats but also George Bush and John Ashcroft. Right now the people who are most upset about Ashcroft are not the Democrats but his former allies on the evangelical and libertarian right, who are afraid that many of the powers he's supporting will be used in the future by the attorney general for President Hillary Clinton.
Privacy crosses ideological lines, and that's a happy coalition that we have to cultivate.

What role can CSOs, as professionals who deal with security every day, play in the privacy versus security debate?

CSOs have a real opportunity to help make thoughtful choices about designing some of these technologies in ways that strike a better balance. Even on a simple level, such as constructing an identification system for a company, you face a host of choices about how much data to share, how much identifiable location information to retain, and what type of storage for biometrics might be appropriate. These choices all will have dramatic consequences for how a system is used and whether it protects privacy or threatens it. Your readers are among the most technologically sophisticated policy-makers around. And within the constraints of their primary responsibility to the company above all, I hope that there's some freedom to make good choices that strike a balance between privacy and security.

Certainly they know and understand the distinction between real security and symbolic, feel-good security.

Absolutely right. They're also in a better position than politicians to resist some of the more emotional excesses of public opinion and make responsible choices. They are not elected, and they don't have to pander to their employees. I end the book expressing skepticism about the idea that some trained corps of elites might be able to make some of these choices for us. That was Walter Lippmann's suggestion at the end of his classic book Public Opinion. I think it's even less plausible today, but to the degree that we do have a corps of thoughtful elites who can make rational policy choices, weighing the costs and benefits of technologies, I think your readers are as close to it as we can find.

Skepticism is woven throughout your book. Are you at all hopeful about balancing liberty and security?

I was both surprised and heartened to find that Congress has been a bright light in this.
I never would have expected, in thinking this project through, that Congress would be more dispassionate than the courts. It surprised me after 9/11, and it changed the way I thought about the efficacy of relying on the courts. I was also surprised by my growing doubts about the public's ability to process these questions in a calm manner. The literature on risk management was largely new to me, and I was both excited and sobered to find that our reaction after 9/11 paralleled in many ways the public's responses to earlier, pre-9/11 fears.

Dan Geer, former @Stake CTO, has posited that attitudes about privacy are generational and that current youth have grown up with a lower expectation of privacy. How will this affect the balancing act?

It's a very interesting observation, and one that in some ways I wish I had explored more directly. I'm even struck, when I run scenarios by students, by how much less concerned they are about the hazards of exposure than I am. It makes intuitive sense, because what we're talking about here are norms of reticence, manners that are deeply generational. People who grew up in a hierarchical society where you kept private matters to yourself will be more distressed by exposure than those who grew up in an unbuttoned world. If that observation is correct, then the hazards of exposure are very much the beginning of a series of social transformations whose consequences we can only begin to appreciate.

Copyright © 2004 IDG Communications, Inc.
