Take time to think about security amidst the greatest gadget show on the planet – CES

Let's put our thinking caps back on and ask the right questions.


Undoubtedly, the biggest, boldest and most alluring consumer show on the planet – the annual Consumer Electronics Show (CES) – is happening this week in Las Vegas. And even a non-earthling can hazard an educated guess that every category of device – home, health, transportation, hygiene, fitness, entertainment, sports, etc. – is going to have a few of the same common attributes.

Everything is going to be connected to the internet, claim some form of AI, come with an app that screams for attention, integrate with a smart assistant like Alexa or Google Home, plug into third-party connected devices and platforms, and promise to make you better, faster, stronger… you get the picture.

Here’s one more similarity most of these gadgets will have, and it’s one that should scare us all, consumers and manufacturers alike: a blind eye when it comes to defining and defending consumers’ rights.

But to begin with, let’s delve one level deeper into what all these devices are doing:

Data

The underlying currency that makes everything at CES tick. Every device is mining for data – from intrusive ones like pacemakers that know your innermost heartbeat to more subtle offerings, such as fashionable eyewear that searches for every eyebrow twitch and gleam of sweat on the forehead of an unsuspecting bystander, making predictions (right or wrong) about their mental equilibrium, social status, financial status, etc.

Predictions

This is where the more advanced vendors have an upper hand. The more data you have, the better your training algorithms and the more accurate your predictions. But better predictions beg for more data – so mining consumers for data becomes a staple diet. And poor predictions can have some pretty unfortunate ramifications, such as Google returning images of African-Americans for searches on words like "gorilla" and "chimp."

Action

This is the manifestation of the prediction. An insulin pump instructed to deliver more insulin after detecting a precipitous drop in real-time blood sugar levels, for instance. Or a treadmill that slows down after detecting that the runner is dehydrated and might pass out.

But it all starts with data. Lots of it. And that data needs to be stored somewhere. And protected. And unnecessary data disposed of before it becomes a liability. And consumers made aware – in simple and comprehensible language – as to how this data is being collected. And how they can demand it to be handed over. Or destroyed.

That should be making headlines at CES. It is not. And that needs to change, starting with the manufacturers (although consumer education needs to happen, too).

How?

The CISO and her team need to start exercising more muscle when it comes to product decisions. Imagine if the CISO's team at iRobot had exercised judgment and prevented the product team from introducing the feature that allowed the Roomba vacuum to collect detailed home floor maps and send them to the iRobot cloud. (And then share that data with third parties, all unbeknownst to the customer!)

Create a data collection timer that expires after a default period and can only be renewed with an expressly stated purpose. Easier said than done? Not really.

The technology to timestamp data collection already exists. Ditto for expiration. Is there a scale problem if the device collects petabytes of data every single day? Absolutely. But enforcing expiration timers imposes a useful discipline: collect only the data that is absolutely needed, and destroy previously collected data that no longer serves any purpose. That's data minimization. It could also double as a service to the consumer – data portability and destruction – which the General Data Protection Regulation (GDPR) demands anyway.
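The timer-and-purge idea above can be sketched in a few lines. This is a minimal illustration, not any vendor's actual implementation; the record fields, the 90-day default window, and the function names are all assumptions made for the example.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

# Hypothetical default retention window; a real policy would set this per purpose.
DEFAULT_TTL = timedelta(days=90)

@dataclass
class DataRecord:
    payload: bytes
    purpose: str                      # the expressly stated reason for collection
    collected_at: datetime
    ttl: timedelta = DEFAULT_TTL

    def expired(self, now: datetime) -> bool:
        # Past its expiration timer once collected_at + ttl has elapsed.
        return now >= self.collected_at + self.ttl

def renew(record: DataRecord, new_purpose: str, now: datetime) -> DataRecord:
    """Renewal requires a freshly stated purpose; the clock restarts."""
    return DataRecord(record.payload, new_purpose, collected_at=now, ttl=record.ttl)

def purge_expired(store: list[DataRecord], now: datetime) -> list[DataRecord]:
    """Data minimization sweep: keep only records still within their timer."""
    return [r for r in store if not r.expired(now)]
```

A periodic job running `purge_expired` over every data store would give the expiration timer teeth; the same record metadata (purpose, collection time) is what a portability or destruction request would need to enumerate.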

And finally, as a CISO or chief privacy officer or chief ethics officer, run a periodic tabletop exercise: randomly select a few data stores… assuming, of course, you know where your data is stored in the first place, and this doesn't become an eye-opening exercise.

Pretend the data has been hacked and made public. How would you react? This might be hard to do the first go-round. But the best time to do it probably isn't when you're also responding to a New York Times exposé and the threat of shareholder litigation. The goal is to prove to yourself and your peers how prepared you are as an organization (or aren't) to deal with unsavory cybersecurity issues.

Are these three silver bullets – getting involved in product decisions, data expiration and minimization principles, and tabletop exercises – enough to make you sleep well at night? Well, they will certainly make you sleep better than the organizations that aren't doing any of this. And hasn't security always been relative? Making your house more secure than your neighbor's automatically makes his more attractive to a burglar. Applied to digital assets, this is the cybersecurity theory of relativity.

Before #CES2019 winds down, I’m optimistic there’s going to be more talk (and walk) about consumer privacy, enterprise ethics and transparent data collection. While truly thinking and feeling cars and cognitive showerheads may dominate the hallways, I’m hopeful this existential conversation is happening simultaneously amidst the din, lights and revelry.

This article is published as part of the IDG Contributor Network.
