Europe Reimagines Orwell’s Memory Hole

November 16, 2010

Inspired by thoughtful pieces from Mike Masnick on Techdirt and from L. Gordon Crovitz in his column yesterday in The Wall Street Journal, I wrote a perspective piece this morning for CNET regarding the European Commission’s recently proposed “right to be forgotten.”

A Nov. 4th report promises new legislation next year “clarifying” this right under EU law, suggesting not only that the Commission thinks it’s a good idea but, more surprisingly, that it already exists under the landmark 1995 Privacy Directive.

What is the “right to be forgotten”?  The report is cryptic and awkward on this important point, describing “the so-called ‘right to be forgotten’, i.e. the right of individuals to have their data no longer processed and deleted when they [that is, the data] are no longer needed for legitimate purposes.”

The devil, of course, will be in the forthcoming details.  But it’s important to understand that under current EU law, the phrase “their data” doesn’t just mean information a user supplies to a website, social network, or email host.  Any information that refers to or identifies an individual is considered private information under the control of the person to whom it refers.  So “their data” means anyone’s data, even if the individual identified had nothing to do with its collection or storage.

And EU law doesn’t limit privacy protections to computer data. Individuals have the right to control information about them appearing in printed and other analog formats as well.

As I say in the piece, the “right to be forgotten” begins to sound like Big Brother’s “memory hole” in Orwell’s classic 1984.  But instead of Winston Smith “rectifying” newspaper articles at the direction of his faceless masters at the Ministry of Truth, a right to be forgotten creates a kind of personal memory hole.  Something you did in the past that you would prefer had never happened?  Just issue orders to anyone who knows about it, and force them to destroy the evidence.

Of course such a right would be as impractical to enforce as it is ill-conceived to grant.

Masnick and Crovitz both worry, in particular, about the free speech implications of such a right, for the press and for individuals alike.  Those implications are indeed potentially catastrophic.  Having the power to rewrite history devalues all information, including information that hasn’t been erased.

The social contract operates on facts and the ability to sort out truth from lie.  A right to be forgotten gives every individual the power to rewrite that contract whenever they feel like it.  So who would sensibly enter into such a relationship in the first place?

My concern, however, is even more metaphysical.  The privacy debate currently going on in public policy circles is disturbing, perhaps most of all because it is being framed as a policy discussion.  Rather than work out what costs and benefits we get from increased information sharing with each other, those who feel anxious about the pace of change in digital life are running, as anxious people often do, to regulators, demanding they do something—anything—to alleviate their future shock.  And regulators, who are pretty anxious people themselves, are too often happy to oblige, even when they understand neither the technology nor the implications of their lawmaking.

Beyond the worst possible choice of forum to begin a conversation, the privacy debate in its current form is no debate at all.  It is mostly a bunch of emotional people hurling rhetorical platitudes at each other, trading the worst-case examples of the deadly potential of privacy invasions (teen suicides, evil corporations) with fear-inspiring claims of the risk of keeping information secret (terrorists win).

It’s not really a debate when the two “sides” are talking about entirely different subjects, and when no one’s really listening anyway.  All that happens is that the stress level amps up, and those not participating in the discussion get the distinct impression that the world is about to end.

A starting point for a real conversation about privacy—one that is dangerously absent from any of the current lawmaking efforts—is an understanding of the nature of information.  Privacy in general, and a right to be forgotten in particular, both begin with the false assumption that information (private or otherwise) is a kind of property: a discrete, physical item that can be controlled, owned, traded, used up, and destroyed.  (Both “sides” have fallen into this trap, and can’t seem to get out.)

The fight often breaks down into questions of entitlement—who initially owns the information that refers to me?  The person who found it and translated it into a form that could be accessed by others, or the person to whom it refers, regardless of source?  Under what conditions can it be transferred?  Does the individual maintain a universal and inalienable right of rescission—the ability to take it back later, for any reason, and without compensating the person who now has it?

But these are the wrong questions to be asking in the first place.  Information isn’t property, at least not as understood by our industrial-age legal system or popular metaphors of ownership.  Information, from an economic standpoint, is a virtual good.  It can be “possessed” and used by everyone at the same time.  It can become more valuable in being combined with other information.  It can maintain or improve its value forever.

And, whether the law says so or not, it can’t be repossessed, put back in the safety deposit box, buried at sea, or “devoured by the flames” like the old newspaper articles Winston Smith rewrites when the past turns out to be inconvenient to the Party.  That of course was Orwell’s point.  You can send down the memory hole the newspaper that reported Big Brother’s promise of increased chocolate rations, but people still remember that he said it.  You can try to brainwash them, too, and limit their choice of language to eliminate the possibility of unsanctioned thoughts.  You can destroy the individual who rebels against such efforts.

But it still doesn’t work.  The facts, warts and all, are still there, even when their continued existence is subjectively embarrassing to an individual.  Believe me, I wish sometimes it were otherwise.  I would very much like to “rectify” high school, or my parents, or the recent death of my beloved dog.  The truth often hurts.

But burning all the libraries and erasing all the bits in the world doesn’t change the facts.  It just makes them harder to access.  And that makes it harder to learn anything from them.

Maybe the European Commission was just being sloppy in its choice of words.  Perhaps it has something much more limited in mind for a “right to be forgotten.”  Or perhaps as it begins the ugly process of writing actual directives that must then be implemented in law by member countries, it will see both the impossibility and danger of going down this path.

Perhaps the Commission will then pretend it never actually promised to “clarify” such a right in the first place.

But we’ll all know that it did.  For whatever that’s worth.

  • http://twitter.com/RoyHugo Hugo Roy

    One keyword that’s missing in this article is the word “publicized” (I mean the act of “publishing” something, as in “publier” in French). That’s the real problem of this proposal. When people publish things, it goes beyond the question of data. It is about expression; freedom of expression already has restrictions (in France, it is not unlimited as under the US First Amendment) or consequences to deal with (libel, etc.). A “right to be forgotten” is not a necessity here, we all agree.

    But if we stay on the problem of data that are not published by people (and thus raise the question of “privacy” as in private, personal information), then I see a legitimate right for people to control what data others have collected on them (even with their approval). And there, I concur with Sam Tuke’s concerns.

    Above that, for some background: this idea takes its recent origins from France, where the state secretary for the “Digital Economy” came up with this idea of “droit à l’oubli” (“right to be forgotten”). The real problem is that it was so badly built that it would eventually lead to censorship and, indeed, a memory hole, because the distinction between what’s published by people and what’s not was missing.

  • Pingback: Initial Thoughts about the Markey-Barton ‘Do Not Track Kids’ Bill

