GDPR is more important than ever: The Cambridge Analytica-Facebook meltdown

Why it's critical, existential in fact, for us to do our own “analytica” in this day and age.


If you have not grasped the extent of this fiasco, it could be because it was late evening in the U.S. last Friday, and the weekend had already dawned in most other parts of the world, when this news broke.

In a nutshell, a Facebook app called “thisisyourdigitallife,” built in 2014 by one Dr. Aleksandr Kogan, obtained user data by scraping the profiles of people who took its quiz as well as the profiles of their friends, something apparently allowed under Facebook's policy for third-party apps at the time.

This is the scariest part: although only about 270,000 people took the survey, the New York Times reports, Dr. Kogan was able to obtain data on 50 million users, probably through those friend connections and other means.

And per the report, Cambridge Analytica paid to acquire the personal information from Dr. Kogan, who, Facebook says, claimed to be collecting it for academic purposes. After initially denying the claims, Facebook posted a statement acknowledging the breach and promising to take action.

I am not going to delve into politics here (the data was ultimately used for micro-targeting and influencing the electorate). There is a lot more at stake: subversion, ethics and privacy.

Let us do a quick post-mortem here on the mechanics of this breach, what it means for platform vendors, and ultimately what you and I as consumers can and must do going forward.

Mechanics: A third-party app on Facebook, developed by a purported researcher, was granted far more access to private information than is otherwise permissible for regular third-party apps. That access ballooned beyond the roughly 270,000 people who actually used the app to a massive 50 million users, giving the app unfettered access to a treasure trove of sensitive information. This data was then handed over to another firm, Cambridge Analytica, for a “fee,” and that firm used it for micro-targeting. Finally, after discovering that the firm held all this sensitive information, Facebook demanded its deletion; it banned the firm from its platform only after learning that the data had not been completely deleted.
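To see how one quiz could reach 50 million people, consider how the pre-2015 Graph API (v1.0) behaved. What follows is a minimal sketch, not Dr. Kogan's actual code: it assumes a 2014-era user access token whose owner granted the now-removed friends_* permissions, and the exact fields an app could pull varied by permission, so the field list here is illustrative.

```python
import requests

GRAPH = "https://graph.facebook.com/v1.0"  # API version retired in 2015

def fetch_friends_data(token):
    """Walk the quiz taker's friend list and pull whatever the old
    friends_* permissions exposed (likes, birthday, location, ...)."""
    url = f"{GRAPH}/me/friends"
    params = {
        "access_token": token,
        # These fields were reachable only because the *quiz taker*
        # granted friends_* permissions; the friends never consented.
        "fields": "id,name,birthday,location,likes.limit(100)",
    }
    people = []
    while url:
        resp = requests.get(url, params=params).json()
        people.extend(resp.get("data", []))
        url = resp.get("paging", {}).get("next")  # follow pagination
        params = None  # the "next" URL already carries the query string
    return people

# One token from one consenting quiz taker yields profiles of hundreds
# of non-consenting friends; 270,000 takers at ~200 friends each is how
# a small survey approaches 50 million people.
friends = fetch_friends_data("USER_TOKEN_FROM_QUIZ_APP")  # placeholder token
```

The multiplier is the friend graph: each consenting quiz taker silently exposed a few hundred friends who never installed the app at all.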

Platform vendors’ conundrum: As a platform vendor, the stakes in protecting data have never been higher. Yet the average breach size and impact keep growing. Why? Let's dig in. Facebook, for its part, has to make its platform attractive to third-party developers, and its richest asset is you and me, or rather our data. Making that data available to third-party apps means more developers building on the platform, more stickiness, and more targeted ads, which in turn means higher revenue.

Seems pretty obvious, doesn't it? The friction is that platform vendors must hold themselves to a much higher bar, given how impactful such a breach can be: it can sway hearts and minds and make or break election results. But this is not just a Facebook issue. Every platform vendor faces the same dilemma: the need to drive revenue by monetizing data versus the need to protect that data. And we should expect the typical reaction sequence Facebook displayed when such a breach is exposed: ignorance, followed by denial, followed by outrage.

Where does that leave the consumer? This is where GDPR comes in. It is the first human-comprehensible set of actionable rights accorded to you and me. The best known of these is “the right to be forgotten” (the right to erasure, Article 17): you can request that any large service provider in the world (any with a connection to the EU whatsoever, which in practice is everyone) obliterate your data forever, and they must oblige. Or you can request that your data be handed to you in a “portable” format that you can take with you (data portability, Article 20).
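To make this concrete, here is a minimal sketch of exercising the Article 17 right. The addressee, name and email are placeholders; GDPR does not mandate any particular format, only that controllers respond within one month (Article 12(3)).

```python
from datetime import date
from textwrap import dedent

def erasure_request(controller, your_name, account_email):
    """Draft a GDPR Article 17 ('right to erasure') request.
    Article 20 covers the portable-copy request mentioned above."""
    return dedent(f"""\
        To: Data Protection Officer, {controller}
        Date: {date.today():%d %B %Y}

        Subject: Request for erasure under Article 17 GDPR

        I, {your_name}, request the erasure of all personal data you hold
        that relates to me (account: {account_email}), and written
        confirmation of that erasure. Article 12(3) GDPR requires you to
        respond within one month of receiving this request.
        """)

print(erasure_request("Facebook Ireland Ltd.", "Jane Doe", "jane@example.com"))
```

A plain letter or email like this is enough; the burden of proving timely compliance sits with the controller, not with you.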

Beyond GDPR, there is more that the consumer needs to take control of. In the case of Facebook, this means limiting what third-party apps have access to. That can be confusing, with apps constantly “complaining” that they will not work properly without access to body sensors, contacts or the camera. Ultimately, the user needs to start from a point of zero trust: turn off all access, test for themselves how the app behaves, and then gradually turn permissions back on as needed, as in the sketch below.
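As one concrete way to apply that zero-trust starting point on Android, here is a minimal sketch assuming a device with USB debugging enabled and the adb tool installed; the package name is hypothetical. It revokes the runtime permissions apps most commonly nag for, so you can re-grant them one at a time only if the app genuinely needs them.

```python
import subprocess

PACKAGE = "com.example.quizapp"  # hypothetical app, for illustration only

# Real Android runtime ("dangerous") permissions to strip up front
PERMISSIONS = [
    "android.permission.CAMERA",
    "android.permission.READ_CONTACTS",
    "android.permission.BODY_SENSORS",
    "android.permission.ACCESS_FINE_LOCATION",
]

def revoke_all(package, permissions):
    """Zero trust: revoke every runtime permission, then re-grant one
    at a time only if the app genuinely breaks without it."""
    for perm in permissions:
        # 'adb shell pm revoke' works for runtime permissions on Android 6.0+
        subprocess.run(["adb", "shell", "pm", "revoke", package, perm],
                       check=False)

if __name__ == "__main__":
    revoke_all(PACKAGE, PERMISSIONS)
    # Re-grant selectively after testing, e.g.:
    #   adb shell pm grant com.example.quizapp android.permission.CAMERA
```

The same discipline works without any tooling: deny every permission prompt at first, then grant only what the app demonstrably needs.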

Sound difficult? In reality it is not, but hitting the easy button will have consequences of the “analytica” kind, and then we will act outraged when it happens.

We are on a journey where privacy boundaries are going to be constantly tested. Expecting the platform vendors to suddenly start doing the “morally” right thing is naïve. Consumers need to be savvier and assume extreme ownership of their own data. GDPR provides the framework; it is our duty to exercise it.

Copyright © 2018 IDG Communications, Inc.
