One seemingly unshakeable truth about the online world since it began is this: The Internet never forgets. Once you post anything online, it is recoverable forever – the claims of former IRS official Lois Lerner about “lost” emails notwithstanding. Even promises of photos disappearing after a few seconds have been shown to be bogus.
But that doesn’t mean people won’t try to erase the Internet's memory. The latest effort comes from the Court of Justice of the European Union, which ruled in May that EU citizens have the “right to be forgotten,” meaning that they can ask search engines like Google, Yahoo and Microsoft’s Bing to remove links to their names that are “inadequate, irrelevant or no longer relevant, or excessive in relation to the purposes for which they were processed.”
Good luck with that. For starters, the ruling doesn’t mean the information has been eliminated from the Web entirely – just that it will be harder to find, since it won’t show up as a link in search-engine results.
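The distinction between de-linking and deletion can be made concrete with a toy model (all names here are hypothetical, not how any real search engine works): removing an entry from a search index leaves the underlying page untouched.

```python
# Toy model: de-linking removes the search result, not the page itself.
pages = {"example.eu/story": "article text about someone"}   # the web
index = {"someone": ["example.eu/story"]}                    # a search index

def delist(name, url):
    """Remove a URL from a name's search results (the 'right to be forgotten')."""
    if url in index.get(name, []):
        index[name].remove(url)

delist("someone", "example.eu/story")
assert index["someone"] == []          # no longer findable via this search index
assert "example.eu/story" in pages     # but the page itself still exists
```

The page survives every de-listing; only the pointer to it disappears, and only from that one index.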
But beyond that, critics contend that not only is the ruling unenforceable, it's also already having unintended consequences. Kashmir Hill, writing in Forbes recently, called it a “logistical nightmare,” in part because Google notifies media organizations when it removes one of their stories, “which leads the organization to unearth the story and mention the incident that some European desperately wanted disappeared.”
Rebecca Herold, CEO of The Privacy Professor, noted that it is also “far too easy for others to quickly make copies and save elsewhere. Information is also automatically cached, and often copied, on some sites, including most of the search engines.”
Another logistical problem is that the EU court doesn’t have worldwide jurisdiction. The ruling only applies to the EU version of Google. If you don’t find what you’re looking for there, just switch to the U.S. version, or one from another region of the world.
Yet another difficulty is that, so far, the operators of the search engines themselves are the ones determining whether the requests are legitimate. Both Google and Microsoft require extensive documentation in a four-part form that demands proof of identity, proof of European residency, a list of any pseudonyms and an email address, and asks whether the person is a public figure or a local community leader.
It then requires a list of the pages the person wants blocked, along with the reasons for the request. The companies say they then make a decision by seeking to balance the right of privacy with public interest and free expression.
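The removal-request process described above amounts to a structured form plus a completeness check. Here is a minimal sketch of that structure; the field names and the `is_complete` rule are illustrative assumptions drawn from the article's description, not the search engines' actual schema.

```python
from dataclasses import dataclass, field

@dataclass
class RemovalRequest:
    # Hypothetical field names; the real forms are web documents, not an API.
    full_name: str
    identity_document: bytes            # proof of identity
    eu_residency_proof: bytes           # proof of European residency
    email: str
    pseudonyms: list = field(default_factory=list)
    is_public_figure: bool = False
    # Each entry: (URL to de-list, reason for the request)
    pages: list = field(default_factory=list)

    def is_complete(self):
        """Needs identity, residency proof, and at least one URL with a reason."""
        return bool(self.identity_document and self.eu_residency_proof
                    and self.pages
                    and all(reason for _, reason in self.pages))
```

Even a complete request only starts the process: as the article notes, the company then weighs privacy against public interest and free expression before deciding.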
According to Hill, Google reported that over the course of two months, it had approved about half of 100,000 requests to have links erased. But the bottom line is that the company, not a disinterested court or public-interest council, makes the decision.
That setup has drawn criticism from various directions. “The EU advocate general who advised the court against this decision foresaw that it would be impossible for Google to judge millions of individual cases, each subject to nuance and particular facts,” said Jules Polonetsky, executive director of the Future of Privacy Forum.
“Who speaks for the researchers who are looking for that information? Who speaks for the publisher of the information?” he asked.
Herold agrees. “This subjectivity leads to inconsistent application of a poorly written – even with good intent – law,” she said. “It needs to be rewritten to clearly indicate the types of information that are candidates for removal, and the specific steps and standards that all entities – Google, Microsoft, Facebook, etc. – must follow to make their determination.”
Herold offered a few examples of information that would qualify for removal – “not an exhaustive list,” she said – such as sensitive personal information like Social Security or credit card numbers, false information posted by cyberbullies, or information posted by someone impersonating someone else.
Examples of things that would not qualify, she said, could include criminal convictions or historical and government information.
Creating a comprehensive standard for what gets de-linked “will likely take some work and time,” she said, “but it is much better than supporting a vague and inconsistently applied law for an undetermined time.”
Polonetsky also noted that while a good case could be made for taking many stories down, “we won’t know until years later if a story about a certain individual is important, and by then it may be too late – the person may be the president or prime minister.”
Google cofounder Larry Page has argued that it raises another potential problem – that oppressive governments will use it to erase things they don’t like. Wikipedia founder Jimmy Wales contends that it amounts to “censoring history.”
Those arguments are a bit less compelling to privacy advocates. Herold said Wales “should be more concerned about keeping the information on his site more accurate without the ongoing barrage of completely false information being made within it. Making sure history is accurate, as opposed to propagating bogus information, is not censoring, it is correcting and improving upon the quality [of the record].”
And Polonetsky pointed out that “oppressive governments around the world have been seeking to block information they don’t like” since long before the EU court ruling.
Privacy advocates do agree with the general concept that while it is impossible to erase things completely from the Internet, government should make an effort to make it more difficult to find bogus information.
Whether there is a way to do that is an entirely different matter. David Meyer, writing in Yahoo Finance last week, noted that a British House of Lords committee had declared that the EU court’s ruling on the right to be forgotten is unworkable.
“It is no longer reasonable or even possible for the right to privacy to allow data subjects a right to remove links to data which are accurate and lawfully available,” the committee said.
Meyer wrote that no one so far has been able to resolve what he called a “fundamental paradox … the internet’s very nature pushes a certain transmit-it-all, retain-it-all ethos that is possibly impossible to stop, but that also flies in the face of privacy principles held very deeply by many countries and citizens.”
“Somebody, please, find a realistic way to fix this,” he wrote, noting that the stakes are high – that people’s lives can be destroyed “by random comments they made online as kids.”
If there is a way to fix it, it will likely take more than government. Privacy advocates also agree that individuals need to take responsibility for their own privacy.
One way to do so, albeit after the fact, is for people to turn themselves into a “virtual entity” – a technique profiled in CSO several years ago. Professional skip tracer Frank Ahearn said he helps clients open a corporation and conduct all their activities through it.
“Everything about you exists under the corporation,” he said. “The address doesn’t have to be in the same city you’re in. The goal is to make you virtual and have you communicate virtually through this corporation.”
Another technique, Ahearn said, is to “create confusion” with multiple, bogus identities. “We develop about 15 to 20 websites and create all these social media sites around you. Now if you are traveling somewhere and someone puts your name in, they are going to locate those 20 other people before they get to you.”
But even that, he and others agree, is less foolproof than simply being much more careful about what you put online. Theresa Payton, CEO of Fortalice, a former White House CIO and an expert on social media, said neither governments nor hired service providers can protect you if you don't take steps to protect yourself.
She offered a basic list that she said everyone should follow:
- Take sensitive information and conversations offline.
- Leverage two-factor authentication and encryption to add a layer of protection.
- Use communication tools that let you set expiration dates on your messages.
- Use private-browsing features such as Chrome's Incognito mode.
- Abide by the “Grandma Rule” and the “Bad-Guy Rule.” Ask yourself, “Would I be embarrassed if Grandma saw this post?” And, “If a bad guy saw this, could he hurt me or my loved ones?” If yes, leave it offline.
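The two-factor authentication item in the list above rests on one-time codes. As a sketch of how those codes work, here is a minimal implementation of the standard TOTP algorithm (RFC 6238) that most authenticator apps follow, using only the Python standard library; it is an illustration of the mechanism, not a hardened security tool.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, t=None, digits=6, step=30):
    """RFC 6238 time-based one-time password: the codes behind most 2FA apps."""
    key = base64.b32decode(secret_b32, casefold=True)
    # The moving factor is the current 30-second interval number.
    counter = struct.pack(">Q", int((time.time() if t is None else t) // step))
    mac = hmac.new(key, counter, hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                       # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 Appendix B test vector: ASCII secret "12345678901234567890" in Base32
assert totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", t=59, digits=8) == "94287082"
```

Because the code is derived from a shared secret plus the current time, it expires on its own within seconds, which is exactly the property that makes it a useful second factor.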
But she agrees that people need help from government, since they don’t always have complete control over what is posted about them.
Privacy, she said, is not only about people making bad or careless choices. “Privacy is about personal choice,” she said. It's one thing to post your own “selfies” or ideas online. But “where people were exposed through someone else’s choice, they feel their privacy was violated.”