Europe’s ‘Right to Be Forgotten’: Privacy as Internet Censorship

January 23, 2012

According to the BBC, the European Commission is apparently set to adopt formal rules guaranteeing a so-called “right to be forgotten” online. As part of the Commission’s overhaul of the 1995 Data Protection Directive, the new regulation will mandate that “people will be able to ask for data about them to be deleted and firms will have to comply unless there are ‘legitimate’ grounds to retain it,” the BBC reports.

I’ve written about “right to be forgotten” and “online eraser button” proposals before in my Forbes essay “Erasing Our Past On The Internet,” in a Mercatus white paper, “Kids, Privacy, Free Speech & the Internet: Finding the Right Balance,” and in an essay here on the TLF, “The Conflict Between a ‘Right to Be Forgotten’ & Speech / Press Freedoms.” While I can appreciate the privacy and reputational concerns that lead to calls for such information controls, the reality is that a mandatory “right to be forgotten” is a recipe for massive Internet censorship. As I noted in those earlier essays, such notions conflict violently with speech rights and press freedoms. Enshrining such expansive privacy norms into law places stricter limits on others’ rights to speak freely and to collect and analyze information about them.

The ramifications for journalism are particularly troubling. Good reporting often requires being “nosy” while gathering facts, and journalists (and historians) might suddenly be subjected to restraints on their research and writing. The Brits have struggled with this in trying to enforce gag orders and “super-injunctions” on media providers to protect privacy. It hasn’t turned out well, especially since new social media platforms and speakers easily evade these rules. (See my Forbes column, “With Freedom of Speech, The Technological Genie Is Out of the Bottle.”)

Thus, for a “right to be forgotten” to work, a more formal and robust information control regime will need to be devised to censor the Net and make it “forget” about the digital footprints we have left online. Will the DMCA’s “notice and takedown” model be applied? Beyond the chilling effect associated with dragnet takedowns of online information, it’s unlikely that approach will really work. Keep in mind, this isn’t as simple as just telling large social media operators to delete information on demand. The reality is, as computer scientist Ben Adida notes in his essay “(Your) Information Wants to be Free,” the same forces and factors that complicate other forms of information control, such as copyright and speech restrictions, also complicate the protection of facts about you. “[I]nformation replication doesn’t discriminate: your personal data, credit cards and medical problems alike, also want to be free. Keeping it secret is really, really hard,” Adida correctly notes.

The fact is, information is instantaneously replicated online many times over across many different platforms, sometimes manually and sometimes automatically. Regulation will need to grapple with how to put the genie back in the bottle once countless others have already forwarded or commented on the piece of information someone later wants “forgotten.” And how would automated online archiving and storage services be affected? Would such sites and services be expected to find and purge every possible mention or reference of the offending information? Would they be compensated for the countless deletion requests they receive, or are they just expected to handle them out of the goodness of their hearts?
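To make the replication problem concrete, here is a minimal, purely illustrative sketch; the platform names, sharing rounds, and copy probability are my own hypothetical assumptions, not anything drawn from the Commission’s proposal. It simply shows that a notice-and-takedown style request served on the original host does nothing about copies that have already spread elsewhere:

```python
# Toy illustration (hypothetical numbers, not from the proposal): once a piece
# of information has replicated, a takedown at the origin leaves every
# downstream copy online.
import random

random.seed(42)

PLATFORMS = [f"platform_{i}" for i in range(20)]  # hypothetical hosts


def replicate(origin, rounds=3, p_copy=0.4):
    """Simulate copies of one item spreading from the origin over a few sharing rounds."""
    holders = {origin}
    for _ in range(rounds):
        newly_copied = set()
        for other in PLATFORMS:
            if other not in holders and random.random() < p_copy:
                newly_copied.add(other)
        holders |= newly_copied
    return holders


def takedown(holders, origin):
    """A notice-and-takedown request reaches only the host it is served on."""
    return holders - {origin}


copies = replicate("platform_0")
remaining = takedown(copies, "platform_0")
print(f"hosts with a copy before takedown: {len(copies)}")
print(f"hosts with a copy after takedown at the origin: {len(remaining)}")
```

Even in this simplified model, removing the item from the place it first appeared leaves nearly every copy untouched, which is exactly the enforcement gap described above.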

I could go on, but instead I’d just ask that you read some of the essays I’ve already cited and then take a look at “9 Reasons Why a ‘Right to be Forgotten’ is Really Wrong,” by Joris van Hoboken, a PhD candidate at the Institute for Information Law (IViR) at the University of Amsterdam. It’s an outstanding critique of the notion.

Please keep in mind: just because I raise questions like these does not mean I’m opposed to the notion that online operators should be held to higher standards and be expected to properly safeguard our online information, and perhaps even delete much of it upon request. But moving this process into the legal and regulatory arena opens up a Pandora’s box of potential problems. Censoring the Net, even when it’s for a cause many favor, is very hard and will give rise to many unintended consequences.

[P.S. Here’s a podcast conversation about these issues where Jerry Brito and I discuss the ramifications of such a regulatory regime.]
