Privacy, Security & Government Surveillance

[UPDATE: 2/14/2013: As noted here, this paper was published by the Minnesota Journal of Law, Science & Technology in their Winter 2013 edition. Please refer to that post for more details and cite this final version of the paper going forward.]

I’m pleased to report that the Mercatus Center at George Mason University has just released my huge new white paper, “Technopanics, Threat Inflation, and the Danger of an Information Technology Precautionary Principle.” I’ve been working on this paper for a long time and look forward to finding it a home in a law journal some time soon.  Here’s the summary of this 80-page paper:

Fear is an extremely powerful motivating force, especially in public policy debates where it is used in an attempt to sway opinion or bolster the case for action. Often, this action involves preemptive regulation based on false assumptions and evidence. Such fears are frequently on display in the Internet policy arena and take the form of a full-blown “technopanic,” a real-world manifestation of this illogical fear. While it’s true that cyberspace has its fair share of troublemakers, there is no evidence that the Internet is leading to greater problems for society. This paper considers the structure of fear appeal arguments in technology policy debates and then outlines how those arguments can be deconstructed and refuted in both cultural and economic contexts. Several examples of fear appeal arguments are offered with a particular focus on online child safety, digital privacy, and cybersecurity. The various factors contributing to “fear cycles” in these policy areas are documented. To the extent that these concerns are valid, they are best addressed by ongoing societal learning, experimentation, resiliency, and coping strategies rather than by regulation. If steps must be taken to address these concerns, education and empowerment-based solutions represent superior approaches to dealing with them compared to a precautionary principle approach, which would limit beneficial learning opportunities and retard technological progress.

The complete paper can be found on the Mercatus site here, on SSRN, or on Scribd.

Over at Forbes I have posted some thoughts on the new privacy framework (Consumer Data Privacy in a Networked World) that the Obama Administration released today. In my essay, “The Problem with Obama’s ‘Let’s Be More Like Europe’ Privacy Plan,” I hammer home the same point I’ve made here many times before: Regulation is not a costless exercise. No matter how well-intentioned regulatory proposals may be, they can often have unforeseen, unintended consequences. This is equally true for privacy controls. I discuss how a new privacy regulatory regime could drive up prices for services that currently are free or inexpensive, limit new digital services and innovations, create barriers to entry for new entrants and entrepreneurs, negatively impact the competitiveness of existing U.S. Internet operators, and, more generally, expand the horizons of government power over the Internet.

For a more detailed analysis of these issues, I encourage you to check out my big Mercatus Center filing to the FTC last year on privacy and Do Not Track regulation. Also, here are a few TLF essays that summarize my skepticism about expanded privacy controls:

The White House’s “Consumer Data Privacy in a Networked World” report outlines a revised framework for consumer privacy, proposes a “Consumer Privacy Bill of Rights,” and calls on Congress to pass new legislation to regulate online businesses. The following statement can be attributed to Berin Szoka, President of TechFreedom, and Larry Downes, TechFreedom Senior Adjunct Fellow:

This Report begins and ends as constitutional sleight-of-hand. President Obama starts by reminding us of the Fourth Amendment’s essential protection against “unlawful intrusion into our homes and our personal papers”—by government. But the Report recommends no reform whatsoever for outdated laws that have facilitated a dangerous expansion of electronic surveillance. That is the true threat to our privacy. The report dismisses it in a footnote.

Instead, the Report calls for extensive new regulation of Internet businesses to address little more than the growing pains of a vibrant emerging economy. “For businesses to succeed online,” President Obama asserts, “consumers must feel secure.”  Yet online businesses that rely on data to deliver innovative and generally free services are the one bright spot in a sour economy. Experience has shown consumers ultimately bear the costs of regulations imposed on emerging technologies, no matter how well-intentioned.

The report is a missed opportunity. The Administration should have called for increased protections against government’s privacy intrusions. Focusing on the real Bill of Rights would have respected not only the Fourth Amendment, but also the First Amendment. The Supreme Court made clear last year that the private sector’s use of data is protected speech—an issue also not addressed by this Report.

Szoka and Downes are available for comment at media@techfreedom.org.

Given the importance of privacy self-help—that is, setting your browser to control what it reveals about you when you surf the Web—I was concerned to hear that Google, among others, had circumvented third-party cookie blocking that is a default setting of Apple’s Safari browser. Jonathan Mayer of Stanford’s Center for Internet and Society published a thorough and highly technical explanation of the problem on Thursday.

The story starts with a flaw in Safari’s cookie blocking. Mayer notes Safari’s treatment of third-party cookies:

Reading Cookies: Safari allows third-party domains to read cookies.

Modifying Cookies: If an HTTP request to a third-party domain includes a cookie, Safari allows the response to write cookies.

Form Submission: If an HTTP request to a third-party domain is caused by the submission of an HTML form, Safari allows the response to write cookies. This component of the policy was removed from WebKit, the open source browser engine behind Safari, seven months ago by Google engineers. Their rationale is not public; the bug is marked as a security problem. The change has not yet landed in Safari.

Mayer says Google was exploiting this yet-to-be-closed loophole to install a third-party cookie, after which Safari would allow that domain to write further cookies. After describing “(relatively) straightforward” cookie syncing, Mayer says:

But we noticed a special response at the last step for Safari browsers. … Instead of responding with the “_drt_” cookie, the server sends back a page that includes a form and JavaScript to submit the form (using POST) to its own URL.

Third-party cookie blocking evaded, and users’ preferences frustrated.
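To make the mechanics concrete, here is a minimal sketch, in TypeScript, of the kind of auto-submitted hidden form Mayer describes. Everything in it is illustrative: the function name and the ad-server URL are hypothetical stand-ins, not Google’s actual code. The point is how little machinery is needed to trigger Safari’s form-submission exception.

// Minimal, hypothetical sketch of the form-submission loophole: pre-fix
// Safari lets a third-party response write cookies when the request comes
// from an HTML form submission, so an invisible iframe containing an
// auto-submitted POST form is enough to defeat third-party cookie blocking.

function submitHiddenForm(thirdPartyUrl: string): void {
  // An invisible iframe keeps the submission from navigating the visible page.
  const frame = document.createElement("iframe");
  frame.style.display = "none";
  document.body.appendChild(frame);

  // Build an empty form inside the iframe that POSTs to the third-party
  // (ad-server) endpoint.
  const frameDoc = frame.contentDocument!;
  const form = frameDoc.createElement("form");
  form.method = "post";
  form.action = thirdPartyUrl;
  frameDoc.body.appendChild(form);

  // Submitting the form makes Safari treat the third party's response as
  // eligible to set cookies, despite the third-party cookie-blocking default.
  form.submit();
}

// Hypothetical usage; the endpoint is illustrative, not a real Google URL.
submitHiddenForm("https://ads.example.com/set-cookie");

And once that first cookie lands, Safari’s “modifying cookies” rule lets the same domain write additional cookies over ordinary requests, which is why closing the form-submission exception in WebKit matters.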

Ars Technica has published Google’s response, which doesn’t seem to have gone up on any of its blogs, in full. Google says it created this functionality to deliver better services to its users, but that doing so inadvertently allowed Google advertising cookies to be set on the browser.

I don’t know that I’m technically sophisticated enough to register a firm judgment, but it looks to me like Google was faced with an interesting dilemma: It had visitors who were signed in to its service and who had opted to see personalized ads and other content, such as ‘+1’s, but those same visitors had set their browsers contrary to those desires. Google chose the route better for Google, defeating the browser-set preferences. That, I think, was a mistake.

I wonder if there isn’t some Occam’s Razor that a Google engineer might have applied at some point in this process, thinking, “Golly, we are really going to great lengths to get around a browser setting. Are we sure we should be doing this?” Maybe it would have been more straightforward to highlight to Safari users that their settings were reducing their enjoyment of Google’s services and ads, and to invite those users to change their settings. This, and urging Apple to fix the browser, would have been more consistent with the company’s credo of non-evil.

Now, on to the ideological stuff; I can think of two items:

1) There is a battle for control of earth out there—well, a battle over whether third-party cookie blocking is good or bad. Have your way, advocates. I think the consuming public—that is, the market—should decide.

2) There is a battle to make a federal case out of every privacy transgression. An advocacy group called Consumer Watchdog (which has been prone to privacy buffoonery in the past) hustled out a complaint to the Federal Trade Commission. I think the injured parties should be compensated in full for their loss and suffering, of which there wasn’t any. De minimis non curat lex (the law does not concern itself with trifles), so this is actually just a learning opportunity for Google, for browser authors, and for the public.

Kudos and thanks are due to Jonathan Mayer, as well as ★★★★★ and Ashkan Soltani, for exposing this issue.

Today the Federal Trade Commission released a new report entitled, “Mobile Apps for Kids: Current Privacy Disclosures Are Disappointing,” which concludes that “confusing and hard-to-find disclosures do not give parents the control that they need in this area.” The FTC argues that “parents need consistent, easily accessible, and recognizable disclosures regarding in-app purchase capabilities so that they can make informed decisions about whether to allow their children to use apps with such capabilities.”

It’s hard to be against the FTC’s “the more disclosure, the better” policy recommendation and I’m not about to come out against it here. But the question is: how much disclosure is enough? Reading through the report and seeing how hard the FTC hammers this point home makes me think the agency wants our app store checkout process to be littered with the pages of fine print disclosure policies that now accompany our credit card statements and home mortgage payments! Seriously, would that make us better off?

As a parent of two kids who both download countless apps on my Android phone, my wife’s iPhone, and our family’s Android tablet, I appreciate a certain amount of disclosure about what sort of information apps are collecting and how they are using it. I think Google’s Android marketplace strikes a nice balance here, providing us with the most crucial facts about what the application will access or share. Apple could do more on disclosure but the company also prides itself (to the dismay of some!) on its rigorous pre-screening process to make sure the apps in the App Store are safe and don’t violate certain privacy and security policies. Yet, as the FTC correctly points out, “the details of this screening process are not clear.” Of course, most Apple users simply don’t give a damn. They’re all too happy to let Apple just take care of it for them even if they’re not really sure what’s happening to their data behind the scenes. The more privacy-sensitive crowd wants greater disclosure and control, of course, and I’m sympathetic to that plea.  But again, how much disclosure is enough? Are you going to wade through pages of disclosure policies and privacy opt-ins before downloading that latest iteration of “Angry Birds” or “Cut the Rope”? Yeah, I didn’t think so.

Anyway, I don’t want to dwell on that. The more interesting findings in the survey relate to price and market dynamics, and I am hoping people don’t ignore them.

My seen-it-all cool was shaken yesterday when I examined how a Senate cybersecurity bill would scythe down legal protections for privacy. Anyone participating in government “cybersecurity exchanges” would have nearly total immunity from liability under any law. No Privacy Act, no ECPA, no E-Government Act, no contract law, no privacy torts. The scuttlebutt is that Senator Reid (D-NV) may push this especially hard as payback to the Internet for the SOPA/PIPA debacle.

In the push for cybersecurity legislation, Congress is driven far more by its desire to act (and D.C. lobbyists’ desire to have Congress act) than by any plausible contribution it can make to the difficult problem of securing computers, networks, and data. That’s why this cybersecurity bill, and all others I have seen, have greater costs than benefits.

Read about the devastation for privacy and the rule of law on offer in a current draft in “The Senate’s SOPA Counterattack?: Cybersecurity the Undoing of Privacy.”

on the Google privacy policy change.

The idea that people should be able to opt out of a company’s privacy policy strikes me as ludicrous.

Plus she embeds a valuable discussion among her Xtranormal friends.

Read the whole thing. Watch the whole thing. And, if you actually care, take some initiative to protect your privacy from Google, a thing you are well-empowered to do by the browser and computer you are using to view this post.

http://www.youtube.com/watch?v=7jHxfJW7Zww

Over at TIME.com I write that if you didn’t like SOPA because it threatened free speech, then you probably won’t like the new “Right to be Forgotten” proposed in the EU. Prof. Jane Yakowitz contributes some great insights to the piece. What I dislike most about the rule is that it subordinates expression to privacy:

[T]he new law would flip the traditional understanding of privacy as an exception to free speech. What this means is that if we treat free expression as the more important value, then one has to prove a harmful violation of privacy before the speaker can be silenced. Under the proposed law, however, it’s the speaker who must show that his speech is a “legitimate” exception to a claim of privacy. That is, the burden of proof is switched so that speakers are the ones who would have to justify their speech.

Read the whole thing at TIME.com.

According to the BBC, the European Commission is apparently set to adopt formal rules guaranteeing a so-called “right to be forgotten” online. As part of the Commission’s overhaul of the 1995 Data Protection Directive, this new regulation will mandate that “people will be able to ask for data about them to be deleted and firms will have to comply unless there are ‘legitimate’ grounds to retain it,” the BBC reports.

I’ve written about “right to be forgotten” and “online eraser button” proposals before in my Forbes essay, “Erasing Our Past On The Internet,” in a Mercatus white paper on “Kids, Privacy, Free Speech & the Internet: Finding the Right Balance,” and in this essay here on the TLF on “The Conflict Between a ‘Right to Be Forgotten’ & Speech / Press Freedoms.” While I can appreciate the privacy and reputational concerns that lead to calls for such information controls, the reality is that a mandatory “right to be forgotten” is a recipe for massive Internet censorship. As I noted in those earlier essays, such notions conflict violently with speech rights and press freedoms. Enshrining such expansive privacy norms into law places stricter limits on others’ rights to speak freely, or to collect and analyze information about others.

Today, the Supreme Court issued its decision in U.S. v. Jones, unanimously holding that law enforcement violated the Fourth Amendment by affixing a GPS tracker to a vehicle to monitor its movements without obtaining a search warrant from a court. The following statement can be attributed to Berin Szoka, President of TechFreedom:

This was an easy case: law enforcement plainly trespassed on private property protected by the Fourth Amendment. But as the majority notes, today’s holding is only the bare minimum of the Constitution’s protections. The harder question awaits the Court: When does purely electronic surveillance—without physical trespass—violate the Fourth Amendment?

At the very least, the Court must reconsider the “third party” doctrine invented by lower courts, which denies us protection for information we share with trusted third parties like “cloud” services that host our email, photos, and documents. The Court should make clear that Fourth Amendment protections hinge not on keeping information secret, but on whether we take steps to preserve that information as private. That, not the “reasonable expectation of privacy,” is the standard the Court applied in its landmark 1967 Katz decision. It is also the only standard that will effectively protect Americans’ privacy in the digital age.

[Cross posted at TechFreedom.org]