What Pepsi's failed ad can teach us about data privacy

Better design and planning would have prevented the Pepsi ad debacle. Those principles also help organizations protect sensitive data.


By now, you've probably seen the ad that Pepsi released to the world and then quickly withdrew when it became obvious how tone-deaf it was.

I don't have anything to say about the ad that hasn't been said already, but I do want to examine the conditions that allowed such an oblivious ad to be released. Why? Because Pepsi's failed attempt to promote itself may hold lessons for anyone anxious to keep their company from experiencing a similar calamity with the release of personal data. I'd like to use this post to explore how the Privacy by Design approach to improving organizational awareness about data protection may offer a way to avoid such pitfalls.

To me, it’s clear that Pepsi didn't want to offend (nor did Kendall Jenner for that matter). They merely wanted to create an ad that

  1. Sold product
  2. Entertained
  3. Aligned the brand with both the progressive intentions of youthful protestors and the benevolent public-mindedness of the police
  4. Positioned their product, Pepsi, as the glue that bound these opposing forces together

Noble goals, as commercial goals go.

But it all went off the rails, as we know. Somehow, Pepsi neglected to include an essential ingredient in their development process. They seem to have left out a means of validating their assumptions and avoiding oblivious and offensive stereotyping (or whatever other flavor of offense you happen to attach to the ad; there are many).

To this outsider, it looks as though the Pepsi ad folks somehow disregarded the potential negative impact of the way they depicted, well, everybody in the commercial. On the way to selling product, they ignored the pitfalls of sugarcoating an anti-corporate protest or too readily associating social protest with consumerism.

The data privacy connection

Now think how such a process might play out in the data privacy space. It’s not hard to imagine, for example, that a team eager to use data to advance their efforts, and to make it easy for people to share data, could stumble into the trap of inadequate data protection. The path to poor decisions, on ads and on data, is all too easy to follow.

And it’s all too plausible that one might take that path: There are a lot of well-meaning people in a wide range of businesses who might find themselves in a similar position with respect to the way they handle personal information gathered from consumers, customers or even employees. Those who design software products and web services that exchange personal information for access to a service or content come most readily to mind. But there are so many other businesses that traffic in personal information that the potential for harm extends to a huge number of organizations.

At every stage in the design and delivery of a product or service that collects or uses personal information of any kind, a decision must be made on behalf of the customer, employee or data subject (just as it should have been during the design of that infamous ad). Questions such as:

  1. Do we privilege the protection and privacy of their information, or do we assume that we understand their intentions and needs better than they do?
  2. Do we collect the absolute minimum of data possible, even when we know we might have to come back and ask for more later, or do we ask for more, assuming we will provide something valuable in return for their data?
  3. Do we prevent others from accessing the data, or do we provide or sell it to others whose interests we deem congruent with our data subject's?

To learn from Pepsi, let’s examine an imaginary team of people working on a promotional offer that would include the collection of data.

This team might have asked for more data than it needed, expecting to market to these users later for other reasons. It might have neglected to get consent to share that data and then shared it with partners who had nothing to do with the original brand. And it might have failed to offer the data subjects a way to opt out of future marketing. All of these actions raise data privacy red flags.

It's this kind of team whose sloppy data protection practices might lead directly to a data breach. It's also this kind of team whose actions might lead in some secondary way to data loss, either through accidental disclosure, malware infection or one of the many other vectors of attack opened when people fail to think systematically about the data that flows through every organization.

Privacy by Design

But expose this same team to the principles of Privacy by Design, and you suddenly have a very different effort. From the start, the team considers the implications of gathering consumer data and seeks to minimize what it collects. It offers clear and explicit options for data subjects to decide whether they want additional marketing, and equally clear access to remove themselves from future communications altogether.
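As a minimal sketch of what those defaults might look like in practice (all names here are hypothetical, not drawn from any particular framework), a promotional signup handler built on these principles would collect only the fields it needs, default consent to off, and treat opt-out as a first-class operation:

```python
from dataclasses import dataclass

# Hypothetical example: the promotion needs only an email address to
# deliver the offer, so that is all we keep.
REQUIRED_FIELDS = {"email"}

@dataclass
class Signup:
    email: str
    marketing_consent: bool = False  # off by default; must be explicitly opted into
    opted_out: bool = False

def collect_signup(form: dict) -> Signup:
    # Data minimization: build the record only from the fields we asked for,
    # silently dropping anything extra the form happens to send.
    return Signup(
        email=form["email"],
        marketing_consent=bool(form.get("marketing_consent", False)),
    )

def opt_out(signup: Signup) -> Signup:
    # Opt-out revokes consent and flags the record for suppression in
    # all future communications.
    signup.marketing_consent = False
    signup.opted_out = True
    return signup

# A form that volunteers an unsolicited phone number is trimmed to the minimum.
s = collect_signup({"email": "a@example.com", "phone": "555-0100"})
```

The point of the sketch is the shape of the defaults, not the specific code: extra data never enters the system, and consent is something the data subject grants rather than something the team assumes.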

When a team starts from the presumption that personal data deserves special protection, it also tends to gravitate toward a generally protective stance on all data: the kind of stance that guards against data loss of every sort. If only Pepsi had started from a similar presumption about the social impact of its ad.

What's the lesson here? What can a poorly conceived and tone-deaf ad for a sugary drink teach us about data privacy? As data privacy and infosec professionals, we need to commit ourselves, across every area of the organization, to core privacy principles embedded in methodologies like Privacy by Design. An organization-wide approach like this is our best bet for protecting sensitive data from start to finish.

This article is published as part of the IDG Contributor Network.
