Privacy Solutions

I spaced out and completely forgot to post a link here to my latest Forbes column, which came out over the weekend.  It’s a look back at last week’s hullabaloo over “Apple, The iPhone, and a Locational Privacy Techno-Panic.” In it, I argue:

Some of the concerns raised about the retention of locational data are valid. But panic, prohibition and a “privacy precautionary principle” that would preemptively block technological innovation until government regulators give their blessings are not valid answers to these concerns. The struggle to conceptualize and protect privacy rights should be an evolutionary and experimental process, not one micro-managed at every turn by regulation.

I conclude the piece by noting that:

Public pressure and market norms also encourage companies to correct bone-headed mistakes like the locational info retained by Apple.  But we shouldn’t expect less data collection or less “tracking” any time soon.  Information powers the digital economy, and we must learn to assimilate new technology into our lives.

Read the rest here. And if you missed the essay Larry Downes posted here on the same subject last week, make sure to check it out.

Inspired by thoughtful pieces by Mike Masnick on Techdirt and L. Gordon Crovitz’s column yesterday in The Wall Street Journal, I wrote a perspective piece this morning for CNET regarding the European Commission’s recently proposed “right to be forgotten.”

A Nov. 4th report promises new legislation next year “clarifying” this right under EU law, suggesting not only that the Commission thinks it’s a good idea but, even more surprisingly, that it already exists under the landmark 1995 Privacy Directive.

What is the “right to be forgotten”?  The report is cryptic and awkward on this important point, describing “the so-called ‘right to be forgotten’, i.e. the right of individuals to have their data no longer processed and deleted when they [that is, the data] are no longer needed for legitimate purposes.”

Continue reading →

I participated last week in a Techdirt webinar titled “What IT needs to know about Law.”  (You can read Dennis Yang’s summary here, or follow his link to watch the full one-hour discussion.  Free registration required.)

The key message of The Laws of Disruption is that IT and other executives need to know a great deal about law—and more all the time.  And Techdirt does an admirable job of reporting the latest breakdowns between innovation and regulation on a daily basis.  So I was happy to participate.

Legally-Defensible Security

Not surprisingly, there were far too many topics to cover in a single seminar, so we decided to focus narrowly on just one:  potential legal liability when data security is breached, whether through negligence (lost laptop) or the criminal act of a third party (hacking attacks).  We were fortunate to have as the main presenter David Navetta, founding partner with The Information Law Group, who had recently written an excellent article on what he calls “legally-defensible security” practices.

Continue reading →

At the FTC’s second Exploring Privacy roundtable at Berkeley in January, many of the complaints about online advertising centered on how difficult it was to control the settings for Adobe’s Flash player, which is used to display ads, videos and a wide variety of other graphic elements on most modern webpages. Complaints also focused on the potential for unscrupulous data collectors to “re-spawn” standard (HTTP) cookies even after a user deleted them, simply by referencing the Flash cookie stored on the user’s computer by that domain—thus circumventing the user’s attempt to clear out their own cookies. Adobe responded to the first criticism by promising to include better privacy management features in Flash 10.1, and to the second by condemning such re-spawning and calling for “a mix of technology tools and regulatory efforts” to deal with the problem (including FTC enforcement). (Adobe’s filing offers a great history of Flash, a summary of its use and an introduction to Flash Cookies, which Adam Marcus detailed here.)

Earlier this week (and less than three weeks later), Adobe rolled out Flash 10.1, which offers an ingenious solution to the problem of how to manage Flash cookies: Flash now simply integrates its privacy controls with Internet Explorer, Firefox and Chrome (and will soon do so with Safari). So when the user turns on “private browsing mode” in these browsers, Flash cookies will be stored only temporarily, allowing users to use the full functionality of the site, but the Flash Player will “automatically clear any data it might store during a private browsing session, helping to keep your history private.” That’s a pretty big step and an elegantly simple solution to the problem of how to empower users to take control of their own privacy. Moreover:

Flash Player separates the local storage used in normal browsing from the local storage used during private browsing. So when you enter private browsing mode, sites that you previously visited will not be able to see information they saved on your computer during normal browsing. For example, if you saved your login and password in a web application powered by Flash during normal browsing, the site won’t remember that information when you visit the site under private browsing, keeping your identity private.

Continue reading →

By Eric Beach and Adam Marcus

In the previous entry in the Privacy Solutions Series, we described how privacy-sensitive users can use proxy servers to anonymize their web browsing experience, noting that one anonymizer stood out above all others: Tor, a sophisticated anonymizer system developed by the Tor Project, a 501(c)(3) U.S. non-profit venture supported by industry, privacy advocates and foundations, whose mission is to “allow you to protect your Internet traffic from analysis.” The Torbutton plug-in for Firefox makes it particularly easy to use Tor and has been downloaded over three million times. The TorBrowser Bundle is a pre-configured “portable” package of Tor and Firefox that can run off a USB flash drive and does not require anything to be installed on the computer on which it is used. Like most tools in the Privacy Solutions Series, Tor has its downsides and isn’t for everyone. But it gives privacy-sensitive users a powerful way to achieve a degree of privacy that no regulation could provide.
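
For readers comfortable with a little scripting, here is a minimal sketch (not part of the Tor Project’s official tooling) of routing an ordinary web request through a locally running Tor client. It assumes Tor is already installed and listening on its default SOCKS port, 9050, and that SOCKS support for the Python requests library is installed (pip install requests[socks]):

    # Minimal sketch: fetch a page through a locally running Tor client.
    # Assumes Tor is listening on its default SOCKS port (127.0.0.1:9050)
    # and that SOCKS support is available: pip install requests[socks]
    import requests

    proxies = {
        "http": "socks5h://127.0.0.1:9050",   # socks5h also resolves DNS through Tor
        "https": "socks5h://127.0.0.1:9050",
    }

    # The destination site sees a Tor exit node's IP address, not yours.
    response = requests.get("https://check.torproject.org/", proxies=proxies, timeout=60)
    print(response.status_code)

For everyday browsing, the Torbutton plug-in or the TorBrowser Bundle remains the simpler route; a proxy configuration like this is mainly useful for scripts and other non-browser traffic.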

Continue reading →

By Eric Beach & Adam Marcus

Among Internet users, there are a variety of concerns about privacy, security and the ability to access content. Some of these concerns are quite serious, while others may be more debatable. Regardless, the goal of this ongoing series is to detail the tools available to users to implement their own subjective preferences. Anonymizers (such as Tor) allow privacy-sensitive users to protect themselves from the following potential privacy intrusions:

  1. Advertisers Profiling Users. Many online advertising networks build profiles of likely interests associated with a unique cookie ID and/or IP address. Whether this assembling of a “digital dossier” causes any harm to the user is debatable, but concerned users can use an anonymizer to make such profiling difficult, particularly by changing their IP address regularly (see the sketch after this list).
  2. Compilation and Disclosure of Search Histories. Some privacy advocates such as EFF and CDT have expressed legitimate concern at the trend of governments subpoenaing records of the Internet activity of citizens. By causing thousands of users’ activity to be pooled together under a single IP address, anonymizers make it difficult for search engines and other websites–and, therefore, governments–to distinguish the web activities of individual users.
  3. Government Censorship. Some governments prevent their citizens from accessing certain websites by blocking requests to specific IP addresses. But an anonymizer located outside the censoring country can serve as an intermediary, enabling the end-user to circumvent censorship and access the restricted content.
  4. Reverse IP Hacking. Some Internet users may fear that the disclosure of their IP address to a website could increase their risk of being hacked. They can use an anonymizer as an intermediary between themselves and the website, thus preventing disclosure of their IP address to the website.
  5. Traffic Filtering. Some ISPs and access points allocate their Internet bandwidth depending on which websites users are accessing. For example, bandwidth for information from educational websites may be prioritized over Voice-over-IP bandwidth. Under certain circumstances, an anonymizer can obscure the final destination of the end-user’s request, thereby preventing network operators or other intermediaries from shaping traffic in this manner. (Note, though, that to prevent deep packet inspection, an anonymizer must also encrypt data).
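
To make the IP-masking idea concrete, here is a rough sketch (not tied to any particular anonymizer’s documentation) that compares the IP address a website sees when you connect directly with the one it sees through an anonymizing proxy. The proxy address is a placeholder for whatever anonymizer you actually use, the echo service simply returns the caller’s public IP as plain text, and (as in the Tor example above) SOCKS support for the requests library must be installed:

    # Rough sketch: compare the IP address a site sees with and without an anonymizer.
    # The proxy setting is a placeholder (here, a local Tor client's SOCKS port);
    # substitute whatever anonymizing proxy you actually use.
    import requests

    ECHO_URL = "https://api.ipify.org"                # returns the caller's public IP
    PROXIES = {"https": "socks5h://127.0.0.1:9050"}   # placeholder anonymizer

    direct_ip = requests.get(ECHO_URL, timeout=30).text
    proxied_ip = requests.get(ECHO_URL, proxies=PROXIES, timeout=30).text

    print("IP seen directly:      ", direct_ip)
    print("IP seen through proxy: ", proxied_ip)      # should differ if the proxy works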

Continue reading →

Remember, remember the Fifth of November,
The Gunpowder Treason and Plot Privacy Dashboard, so hot,
I know of no reason
Why the Gunpowder Treason Privacy Dashboard
Should ever be forgot.

Sorry, I couldn’t resist, this being Guy Fawkes day (a major traditional holiday for Britons and, more recently, geeky American libertarians such as myself, who dress up as V for Vendetta for Halloween). Google’s announcement of its Privacy Dashboard (TechCrunch) is a major step forward in both informing users about what data Google has tied to their account in each of Google’s many products and in empowering users to easily manage their privacy settings for each product. If users decide they’d rather “take their ball and go home,” they can do that, too, by simply deleting their data.

Users can access the dashboard at www.google.com/dashboard (duh). Or, from the Google homepage, you just have to:

  1. Click on Settings at the top right > Google Account Settings
  2. Click on “View data stored with this account” next to “Dashboard”

Once you log in (for extra security), you can:

  1. See what data is associated with your account in 23 of Google’s products (Google notes that it will incorporate its 18 other products in the near future).
  2. Directly access the privacy management settings for that account.
  3. Access more information—”Links to relevant help articles and information pages.”

Some critics have complained in the past that it’s too hard to find privacy settings links on Google and other sites. Indeed, Google could have made it easier—and now they have! Google has taken another major step forward in user education and empowerment—just as it pioneered transparency into its interest-based advertising product with the Ad Preference Manager launched in March (which I applauded here). (The Dashboard is only for data tied to a user’s Google account, while the APM is tied only to a cookie on the user’s computer.)

The Dashboard really couldn’t be much easier to use—yet we can be sure it won’t be good enough for some privacy zealots who arrogantly presume that their fellow homo sapiens are basically vegetables with hair—unable to use any tool online, no matter how simple, and barely able to tie their own shoelaces without government reminding them how. The principled alternative is to “Trust People & Empower Them.” Because privacy is so profoundly subjective, and because there is an inherent trade-off between clamping down on data and the many benefits Internet users enjoy from sharing their data, Adam Thierer and I have argued that “household standards” set by individuals should trump “community standards” imposed on everyone from above: Continue reading →

PFF summer fellow Eric Beach and I have been working on what we hope is a comprehensive taxonomy of all the threats to online security and privacy. In our continuing Privacy Solutions Series, we have discussed and will continue to discuss specific threats in more detail and offer tools and methods you can use to protect yourself.

The taxonomy is located here.

The taxonomy of 21 different threats is organized as a table that indicates the “threat vector” and goal(s) of attackers using each threat. Threats can come from websites, intermediaries such as an ISP, or from users themselves (e.g., using an easy-to-guess password), and attackers’ goals range from simply monitoring which (or what type of) websites you access to executing malicious code on your computer. Following the table is a glossary defining each threat and providing links to more information.

Please share any comments, criticisms, or suggestions as to other threats or self-help privacy/security management tools that should be added by posting a comment below.

Today’s Washington Post has a story entitled U.S. Web-Tracking Plan Stirs Privacy Fears. It’s about the reversal of an ill-conceived policy adopted nine years ago to limit the use of cookies on federal Web sites.

In case you don’t already know this, a cookie is a short string of text that a server sends a browser when the browser accesses a Web page. Cookies allow servers to recognize returning users so they can serve up customized, relevant content, including tailored ads. Think of a cookie as an eyeball – who do you want to be able to see that you visited a Web site?
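
For the technically curious, here is a rough sketch, using Python’s standard library, of what that exchange looks like under the hood (the cookie name and value are made up for illustration):

    # Illustrative only: what the Set-Cookie / Cookie exchange looks like.
    from http.cookies import SimpleCookie

    # Server side: build a Set-Cookie header so the browser can be recognized later.
    cookie = SimpleCookie()
    cookie["visitor_id"] = "abc123"
    cookie["visitor_id"]["max-age"] = 60 * 60 * 24 * 30   # keep it for about 30 days
    print(cookie.output())   # e.g.  Set-Cookie: visitor_id=abc123; Max-Age=2592000

    # Browser side: on later visits, the browser sends the cookie back,
    # and the server parses it to recognize the returning visitor.
    returned = SimpleCookie("visitor_id=abc123")
    print(returned["visitor_id"].value)                    # abc123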

Your browser lets you control what happens with the cookies offered by the sites you visit. You can issue a blanket refusal of all cookies, you can accept all cookies, and you can decide which cookies to accept based on who is offering them. Here’s how:

  • Internet Explorer: Tools > Internet Options > “Privacy” tab > “Advanced” button: Select “Override automatic cookie handling” and choose among the options, then hit “OK,” and next “Apply.”

I recommend accepting first-party cookies – offered by the sites you visit – and blocking third-party cookies – offered by the content embedded in those sites, like ad networks. (I suspect Berin disagrees!) Or ask to be prompted about third-party cookies just to see how many there are on the sites you visit. If you want to block or allow specific sites, select the “Sites” button to do so. If you selected “Prompt” in cookie handling, your choices will populate the “Sites” list.

  • Firefox: Tools > Options > “Privacy” tab: In the “cookies” box, choose among the options, then hit “OK.”

I recommend checking “Accept cookies from sites” and leaving unchecked “Accept third party cookies.” Click the “Exceptions” button to give site-by-site instructions.
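
For readers who script their own HTTP clients, the same site-by-site choice can be made in code. Here is a rough sketch using Python’s standard library (the blocked domain is just a placeholder, not a real ad network):

    # Sketch: accept cookies generally but refuse them from specific domains.
    # "ads.example.com" is a placeholder, not a real ad network.
    import urllib.request
    from http.cookiejar import CookieJar, DefaultCookiePolicy

    policy = DefaultCookiePolicy(blocked_domains=["ads.example.com", ".ads.example.com"])
    jar = CookieJar(policy)

    opener = urllib.request.build_opener(urllib.request.HTTPCookieProcessor(jar))
    opener.open("https://www.example.com/")   # any cookies this site sets are kept

    for c in jar:
        print(c.domain, c.name)               # cookies from blocked domains never appear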

There are many other things you can do to protect your online privacy, of course. Because you can control cookies, a government regulation restricting cookies is needless nannying. It may marginally protect you from government tracking – they have plenty of other methods, both legitimate and illegitimate – but it won’t protect you from tracking by others, including entities who may share data with the government.

The answer to the cookie problem is personal responsibility. Did you skip over the instructions above? The nation’s cookie problem is your fault.

If society lacks awareness of cookies, Microsoft (Internet Explorer), the Mozilla Foundation (Firefox), and producers of other browsers (Apple/Safari, Google/Chrome) might consider building cookie education into new browser downloads and updates. Perhaps they should set privacy-protective defaults. That’s all up to the community of Internet users, publishers, and programmers to decide, using their influence in the marketplace. (I suspect Berin is against it!)

Artificially restricting cookies on federal Web sites needlessly hamstrings them. When the policy was instituted, it threatened to set a precedent for broader regulation of cookie use on the Web. Hopefully, the debate about whether to regulate cookies is over, but further ‘Net nannying is a constant offering of the federal government (and other elitists).

By moving away from the stultifying limitation on federal cookies, the federal government acknowledges that American grown-ups can and should look out for their own privacy.

By Eric Beach, Adam Marcus & Berin Szoka

In the first entry of the Privacy Solutions Series, Berin Szoka and Adam Thierer noted that the goal of the series is “to detail the many ‘technologies of evasion’ (i.e., empowerment or user ‘self-help’ tools) that allow web surfers to better protect their privacy online.” Before outlining a few more such tools, we wanted to step back and provide a brief overview of the need for, goals of, and future scope of this series.

We started this series because, to paraphrase Smokey the Bear, “Only you can protect your privacy online!” While the law can play a vital role in giving full effect to the Fourth Amendment’s restraint on government surveillance, privacy is not something that can simply be created or enforced by regulation because, as Cato scholar Jim Harper explains, privacy is “the subjective condition that people experience when they have power to control information about themselves.” Thus, when the appropriate technological tools and methods exist and users “exercise that power consistent with their interests and values, government regulation in the name of privacy is based only on politicians’ and bureaucrats’ guesses about what ‘privacy’ should look like.” As Berin has put it:

Debates about online privacy often seem to assume relatively homogeneous privacy preferences among Internet users. But the reality is that users vary widely, with many people demonstrating that they just don’t care who sees what they do, post or say online. Attitudes vary from application to application, of course, but that’s precisely the point: While many reflexively talk about the ‘importance of privacy’ as if a monolith of users held a single opinion, no clear consensus exists for all users, all applications and all situations.

Moreover, privacy and security are both dynamic: The ongoing evolution of the Internet, shifting expectations about online interaction, and the constant revelations of new security vulnerabilities all make it impossible to simply freeze the Internet in place. Instead, users must be actively engaged in the ongoing process of protecting their privacy and security online according to their own preferences.

Our goal is to educate users about the tools that make this task easier. Together, user education and empowerment form a powerful alternative to regulation. That alternative is “less restrictive” because regulatory mandates come with unintended consequences and can never reflect the preferences of all users.

Continue reading →