Freedom of Information vs. Right to Privacy: A Long Game of Bras de Fer

It all started with Mario Costeja González of Spain, who sued Google and won, securing the removal of links to content from his past that was still available on the internet years later.

On May 13, 2014, after González won his case before the Court of Justice of the European Union, a "right to be forgotten" was established in Europe. For the first time, Europeans can request that certain content not appear on the SERPs and have links removed.

A new Internet era has arrived – an era where consumers can control some of the content that marketers and others make available to the general public.

Where is this coming from?

With the Internet’s increased penetration globally, it has never been easier to look for and find relevant information about practically anyone or anything. Google has become the traffic cop directing users to information on pretty much everything.

For example, when you’re looking to hire someone and need to check the candidate’s social network activity, or when you have a business meeting and want to look at someone’s online profile, you will probably start with a Google search to find the most accurate information on the web.

Because information spreads so quickly on the web, without Google or another search engine it’s practically impossible to control, or even to know, where certain content can be found. In addition, there is plenty of information and content that people would rather not advertise.

Google’s algorithm controls what information appears on a results page after a specific query has been typed. Often, what comes up first on a SERP may not be what you expect, and yet this is out of anyone’s control. It’s obvious, at least to me, that some people strongly prefer not to have certain information made publicly available.

So what was the public reaction to this new law?

In Europe, the public reaction was immediate, positive, and strong. Some 12,000 requests to remove links to specific information were sent to Google in the first four days after the request forms became available online.

It has been reported that 12 percent of the initial applications related to pedophilia, 30 percent to fraud, and 20 percent to arrests or convictions. For everyone’s well-being, perhaps this kind of information shouldn’t be hidden.

That said, this is a big first win for advocates of personal data protection. Nonetheless, does it violate the freedom of information and everyone’s right to access the most accurate information?

How did Google react?

Google, as usual, reacted fast by providing an online form for link-removal requests. A dedicated team was assembled on the fly, and at this time every online request is being carefully reviewed and evaluated.

Google also set up a committee, including CEO Larry Page, to address this topic, develop an effective strategy, manage it, and report on progress. Ironically, there is no European representative on this committee. Where is this law coming from again?

What does this really mean?

At this point, not much. We need to wait and see if other cases are won and how Google continues handling the requests. The general public may think that by filling out a form to remove some links they can be forgotten, which is not entirely true.

Under the current guidelines, links won’t be deleted from the Internet entirely; the content will merely be hidden from search results in Europe. The information will still be available outside of Europe, so in essence it will not be completely hidden.

In addition, the Court of Justice of the European Union has ruled that “Individuals have a right to control their private data, especially if they are not public figures” and that “Links to irrelevant and outdated data should be erased on request.”¹

That said, it’s just not that easy to get approval. Information will only be removed if “the impact on the individual’s privacy is greater than the public’s right to find it.” This is quite subjective and makes me wonder how these requests will be evaluated.

How will Google decide which requests should be accepted or rejected? How is “outdated” defined? There is still work to be done to determine which factors should influence the decision, and additional criteria need to be established for accepting or rejecting various requests.

Will this new law change the way we look at search results, and how accurate will those results be?

As an increasing number of requests are approved and some results are hidden from the SERPs, this has the potential to totally change the way we look at information on the Internet – at least in Europe. The web is the place to find fresh and accurate content on anything. If links to specific content are removed, how trustworthy can it be?

As of now, this ruling targets individual personal rights. Potentially, it could be expanded to companies and brand marketers who want to hide sensitive developments like product recalls, fiscal fraud, and other delicate issues. This strikes me as the beginning of an awful “Bras de Fer” between user rights and the protectors of freedom of information.

In any case, one thing is clear: “The right to be forgotten cannot amount to a right of the total erasure of history.”²

In short, although some content is hidden, it is still out there on the Internet in other places. So can we step back at any moment and see the content again?

Finally, there are other ways to combat negative search listings, and content marketing best practices apply here. The same strategies we use on behalf of our brand marketing clients at Covario to achieve greater visibility for targeted search results can also help erase or bury some bad history. Participating actively in social spaces can help suppress negative content, and volunteering and doing good deeds can also reverse the trend. Beyond these recommendations, there are more ideas that I’ll save for a future blog post.
