Is Privacy an Unalienable Right? The Problem with Privacy Paternalism

January 27, 2014

Last week, it was my great pleasure to be invited on NPR’s “On Point with Tom Ashbrook” to debate Jeffrey Rosen, a leading privacy scholar and the president and chief executive of the National Constitution Center. In an editorial in the previous Sunday’s New York Times (“Madison’s Privacy Blind Spot”), Rosen proposed a “constitutional amendment to prohibit unreasonable searches and seizures of our persons and electronic effects, whether by the government or by private corporations like Google and AT&T.” He said his proposed amendment would limit “outrageous and unreasonable” collection practices and would go so far as to bar consumers from sharing their personal information with private actors even if they saw an advantage in doing so.

I responded to Rosen’s proposal in an essay posted on the IAPP Privacy Perspectives blog, “Do We Need A Constitutional Amendment Restricting Private-Sector Data Collection?” In it, I argued that Rosen’s proposal suffers from several legal, economic, and practical problems. You can head over to the IAPP blog to read my entire response, but the gist of it is that “a constitutional amendment [governing private data collection] would be too sweeping in effect and that better alternatives exist to deal with the privacy concerns he identifies.” There are very good reasons we treat public and private actors differently under the law, and there are “far more practical and less-restrictive steps that can be taken without resorting to the sort of constitutional sledgehammer that Jeff Rosen favors. We can protect privacy without rewriting the Constitution or upending the information economy,” I concluded.

But I want to elaborate on one thing I found particularly interesting about Rosen’s comments when we were on NPR together. During the show, Rosen kept stressing that we need to adopt a more European construction of privacy as “dignity rights,” and he said his proposed privacy amendment would even disallow individuals from surrendering their private data or their privacy because he views these rights as “unalienable.” In other words, from Rosen’s perspective, privacy pretty much trumps everything, even if you want to trade it off against other values.

Privacy Paternalism?

I’ve been seeing more and more privacy advocates and scholars adopt this attitude, including Anita Allen, Julie Cohen, Siva Vaidhyanathan, and others. Allen, for example, says that privacy is such a “foundational” human right that in some cases the law should override individual choice when consumers act against their own privacy interests. Cohen and Vaidhyanathan make similar arguments in their recent books. Vaidhyanathan claims that consumers are being tricked by the “smokescreen” of “free” online services and “freedom of choice.” Although he admits that no one is forced to use online services and that consumers can opt out of most services or data collection practices, he argues that “such choices mean very little” because “the design of the system rigs it in favor of the interests of the company and against the interests of users.” “Celebrating freedom and user autonomy is one of the great rhetorical ploys of the global information economy,” he says. “We are conditioned to believe that having more choices–empty though they may be–is the very essence of human freedom. But meaningful freedom implies real control over the conditions of one’s life.” These are the sorts of arguments I increasingly hear privacy scholars make when claiming that consumers simply can’t be left free to make choices for themselves in this regard.

In an interesting recent article in the Harvard Law Review, privacy scholar Daniel Solove notes that what binds these thinkers and their work together is, in essence, a sort of privacy paternalism. The point of most modern privacy advocacy has been to better empower consumers to make privacy decisions for themselves. But, Solove notes, “the implication [of these privacy scholars’ work] is that the law must override individual consent in certain instances.” Yet if that choice is taken away from us by law, Solove warns, then privacy regulation “risks becoming too paternalistic. Regulation that sidesteps consent denies people the freedom to make choices,” he argues.

Jeff Rosen now appears to be adopting the sort of approach Solove identifies by claiming that privacy is an “unalienable right” that cannot be traded away for other things. By making that choice for us, Rosen’s proposed amendment would, therefore, suffer from the same sort of privacy paternalism Solove identifies. In a forthcoming article in the Maine Law Review, I identify some of the problems associated with privacy paternalism. Most obviously, these scholars should keep in mind that not everyone shares their privacy values and that many of us will voluntarily trade some of our data for the innovative information services and devices we desire. If imposed in the form of legal sanctions, privacy paternalism would open the door to almost boundless controls on the activities of both producers and consumers of digital services, potentially limiting future innovations in this space.

For example, when we were on NPR together, Rosen mentioned wireless geolocation technology as a potential source of serious privacy harm, although he did not make clear whether he wanted it stopped entirely or merely restricted. If used improperly, wireless geolocation technology certainly can raise serious privacy concerns. But wireless geolocation technology is also what powers the mapping and traffic services that most of us now take for granted. Many of us expect — no, we demand — that our digital devices give us real-time mapping and traffic notification capabilities. And most of us are willing to make the minor privacy trade-off of sharing our location constantly in exchange for these services, which are also provided to us free of charge.

So, what would Rosen’s proposed amendment have to say about this trade-off? Would these wireless geolocation technologies be banned altogether, even if consumers desire them? It isn’t really clear at this point because he hasn’t offered many details about his proposal. But to the extent it would preempt these technological capabilities on the grounds that our locational privacy is somehow an unalienable right, that seems like a fairly paternalistic approach to policy, and it would seem to confirm Thomas Lenard and Paul Rubin’s claim that “many of the privacy advocates and writers on the subject do not trust the consumers for whom they purport to advocate.”

Such paternalism is particularly problematic in this case since privacy is such a highly subjective value and one that evolves over time. As Solove notes, “the correct choices regarding privacy and data use are not always clear. For example, although extensive self-exposure can have disastrous consequences, many people use social media successfully and productively.” Privacy norms and ethics are changing faster than ever today. One day’s “creepy” tool or service is often the next day’s “killer app.”

Balancing Values; Considering Costs

As I will discuss in my forthcoming Maine Law Review article, and as I also discussed in my recent George Mason University Law Review article, consumer protection standards in the United States have traditionally depended on a clear showing of actual, not prospective or hypothetical, harm. In some cases, when the potential harm associated with a particular practice or technology is extreme in character and poses a direct threat to physical well-being, the law has preempted the general presumption that ongoing experimentation and innovation should be allowed by default. But these are extremely rare scenarios, at least as they pertain to privacy concerns under American law, and they have mostly involved health and safety measures aimed at preemptively avoiding catastrophic harm to individual or environmental well-being. In the vast majority of other cases, our culture has not accepted the paternalistic idea that law must “save us from ourselves” (i.e., from our own irrationality or mistakes). As Solove notes in his recent essay, “People make decisions all the time that are not in their best interests. People relinquish rights and take bad risks, and the law often does not stop them.”

Privacy advocates also sometimes ignore the costs of preemptive policy action, failing to conduct any serious review of what their regulatory proposals would cost. As a result, preemptive policy action is almost always their preferred remedy for any alleged harm. “By limiting or conditioning the collection of information, regulators can limit market manipulation at the activity level,” Ryan Calo argues in a recent paper. “We could imagine the government fashioning a rule — perhaps inadvisable for other reasons — that limits the collection of information about consumers in order to reduce asymmetries of information.” [*Clarification: In a comment below and a subsequent Twitter exchange, Ryan clarifies that he ultimately does not come down in favor of such a rule, preferring instead to find various other incentives to solve these problems. I thank him for this clarification — and definitely welcome it! — although I found his position somewhat murky after debating him personally on these issues recently. Nonetheless, I apologize if I mischaracterized his position in any way here.]

Unfortunately, Professor Calo does not fully consider the corresponding costs of such a rule in discussing its possible enactment. If preemptive regulation slowed or ended certain information practices, it could stifle the provision of new and better services that consumers demand, as I have noted elsewhere. It might also trump other choices or values that consumers care about. While privacy is obviously an incredibly important value, we cannot assume that it is the only value, or the most important value, at stake here. Consumers also care about having access to a constantly growing array of innovative goods and services, and about getting those goods and services at a reasonable price.

Moving from “Rights Talk” to Practical Privacy Solutions

This is the point in the essay where some readers are getting pretty frustrated with me and thinking I am some sort of nihilist who doesn’t give a damn about privacy. I assure you that nothing could be further from the truth and that I care very deeply about privacy.

But if you really care about expanding the horizons of privacy protection in our modern world, at some point you have to accept that all the “rights talk” and top-down enforcement efforts in the world are not necessarily going to help as much as you wish they would. The same is true for online safety, digital security, and IP protection efforts: No matter how much you might wish otherwise, information control is just really, really hard. Legal and regulatory approaches to bottling up information flows will inevitably be several steps behind cutting-edge technological developments. (I’ve discussed these issues in several essays here, including: “Privacy as an Information Control Regime: The Challenges Ahead,” “Copyright, Privacy, Property Rights & Information Control: Common Themes, Common Challenges,” and “When It Comes to Information Control, Everybody Has a Pet Issue & Everyone Will Be Disappointed.”)

That doesn’t mean we should surrender in our efforts to identify more concrete privacy harms, but we should recognize that it will always be a hugely contentious matter and that a great many people will gladly trade away their privacy in a way that others will consider outrageous. In a free society, we must allow them to do so if they derive greater utility from other things. A paternalistic approach based on a sort of privacy fundamentalism will deny them the right to make that choice for themselves. And, practically speaking, no matter how much some might think that privacy values are “unalienable,” the reality is that there will be no way to stop many others from making different choices and relinquishing their privacy all the time.

Educating and empowering citizens is the better way to address this issue. We can try to teach them to make better privacy choices and treat their information, and information about others, with far greater care. We should also work to provide citizens more tools to help accomplish those goals. And if the problem is “information asymmetry” or some general lack of awareness about certain data collection and use practices, then let’s work even harder to make sure consumers are aware of those practices and what they can do about them.

It’s all part of the media literacy and digital citizenship agenda that we need to be investing much more of our time and resources in. I outlined that approach in much more detail in this law review article. We need diverse tools and strategies for a diverse citizenry. We need to be talking to both consumers and developers about smarter data hygiene and sensible digital ethics. We need more transparency. We need more privacy professionals working inside organizations to craft sensible data collection and use policies. And so on. Only by working to change attitudes about privacy, online “Netiquette,” and ethical data use can we really start to make a dent in this problem.

If nothing else, we must understand the limitations of information control in such highly context-specific harm scenarios. Prof. Rosen might want to ask himself how long it would take to even get his proposed constitutional amendment in place and what the chances are that such a movement would even be successful. But, again, and far more importantly, Prof. Rosen and advocates of similar regulatory approaches should remember that their values are not shared by everyone and that, in a free society, a value as inherently subjective as privacy is likely to remain a hugely contentious, ever-changing matter, especially when elevated to the level of constitutional rights talk. We need practical solutions to our privacy problems, not pie-in-the-sky Hail Mary schemes that are unlikely to go anywhere and that, even if they did, would end up being too heavy-handed and would override individual autonomy in the process.
