Big data and relinquishing your right to privacy

Now more than ever, it’s time for us to speak up about our right to privacy.

Image credit: Gerd Altmann (CC0 Public Domain)

There’s little doubt that we’re living in the most technologically advanced era in history. Time and again, new innovations have brought with them convenience and a better quality of life, but too often they have also forced a fresh look at the ways those same innovations could be used for harm.

Take dynamite, for example. The American westward expansion meant the construction of a cross-country railroad, but the work of building it was deadly. Swedish scientist Alfred Nobel invented dynamite as a cost-effective way to blast through mountains more safely, which allowed the railroad to be built without as much loss of human life. His creation was later turned into a weapon, which prompted him to leave his fortune to the establishment of the famous prizes for achievements in the sciences, literature, and, of course, peace.

Things like home AI-driven personal assistants, fitness wearables, and even sleep tracking apps aren’t vital to the fabric of society, and they’re certainly not on par with life-changing tools like Nobel’s dynamite. But they are fun, useful, convenient devices that can improve our everyday lives… except when they lead to a loss of privacy.

I’ve written before about my own family’s experience with an Amazon Echo device, and how it made us rethink the privacy we blindly give up without a second thought. There has already been one high-profile privacy dispute, stemming from a criminal case in which prosecutors believed key evidence had been recorded by the defendant’s Echo. More recently, Burger King faced consumer backlash over a marketing campaign in which the actor in a television commercial intentionally activated viewers’ Google Home devices. Rest assured, Google was even less impressed than the consumers were.

A new consumer-goods-versus-privacy lawsuit has now been filed against an audio technology company that manufactures high-dollar sound systems, headphones, and more. The plaintiff claims that the smartphone app that lets you select music, audiobooks, podcasts, and more actually gathers your listening preferences and sells that information to third parties, including a data mining company. On the surface, who cares if an advertiser knows what kind of music you like to listen to when you wake up, or when you get off work? Wouldn’t it be nice for advertisers to present you with offers for things you actually want, rather than random selections?

In the case of this lawsuit, however, the plaintiff and his attorneys are far more concerned about the other content consumers listen to, like podcasts, for example. Is it possible to create a profile about someone based on their music preferences, news sources, and favorite podcasts? Could someone potentially piece together the listener’s religious views, political leanings, or even sexual orientation from the content they choose to consume?

This lawsuit raises an important issue, one that all consumers take for granted: the terms and conditions for using a product. After all, no one forced the public to install Amazon Echo devices in their homes, wear luxury headphones, or download the latest fitness app and agree to have their locations tracked; those were just the “natural consequences” of purchasing and using the device.

But there’s a bigger issue at stake: what happens when the product isn’t voluntary, as in the case of India’s new compulsory identification system, which collects everything from name and address to biometric markers? That’s a stark example, but in the audio technology lawsuit, the plaintiff’s attorneys contend not only that consumers cannot see the terms and conditions at signup, but also that those terms say nothing about gathering and selling their preferences. In a similar privacy situation, Uber has faced scrutiny over news that it developed “fingerprinting” capabilities that allowed it to identify users’ iPhones even after the app was deleted, and that it was able to track users’ locations even when they were not using the service. Is it really so far-fetched to compare forced biometric identification of all citizens with not telling consumers what information is gathered or how it’s used?

At the heart of the privacy debate are the “unspoken” rules about what companies can do with our data. Even when we know that our activity, our information, and even our voices are being recorded and stored, what obligation does a company have to tell us every single way that data can be used? As consumers, we might not mind if our listening preferences are used to advertise related goods or services, but do we have to agree to every possible use of our information, both positive and negative, as an unavoidable part of data gathering?

The bigger concern is why any company would think it’s OK not to inform its customers of the rights they’re signing away. After all, checking the box to confirm you’ve read the full agreement has been called “the biggest lie on the internet.” It’s alarming to think that we have already adopted a cultural mindset in which privacy is just something we sacrifice to make sure we have a ride to the airport, or to turn on our lights when we’re late getting home. If the agreement we enter into by checking that one tiny box, whether we’ve actually read it or not, is akin to signing away our right to privacy, it’s time for us as consumers and advocates to speak up, learn the ramifications, and ultimately vote with our dollars.

This article is published as part of the IDG Contributor Network.