Online Child Safety

Sen. Edward J. Markey (D-Mass.) and Rep. Joe Barton (R-Texas) have reintroduced their “Do Not Track Kids Act,” which, according to this press release, “amends the historic Children’s Online Privacy Protection Act of 1998 (COPPA), will extend, enhance and update the provisions relating to the collection, use and disclosure of children’s personal information and establishes new protections for personal information of children and teens.” I quickly scanned the new bill, and it looks very similar to the bill of the same name that they introduced in 2011, which I wrote about here and then critiqued at much greater length in a subsequent Mercatus Center working paper (“Kids, Privacy, Free Speech & the Internet: Finding The Right Balance”).

Since not much appears to have changed, I would just encourage you to check out my old working paper for a discussion of why this legislation raises a variety of technical and constitutional issues. But I remain perplexed by how supporters of this bill think they can devise age-stratified online privacy protections without requiring full-blown age verification for all Internet users. And once you go down that path, you open up a huge Pandora’s box of problems that we have already grappled with for many years now. As I noted in my paper, the real irony here is that the “problem with these efforts is that expanding COPPA would require the collection of more personal information about kids and parents. For age verification to be effective at the scale of the Internet, the collection of massive amounts of additional data is necessary.” Continue reading →

California’s continuing effort to make the Internet its own digital fiefdom advanced this week when Gov. Jerry Brown signed legislation that creates an online “Eraser Button” just for minors. The law isn’t quite as sweeping as the seriously misguided “right to be forgotten” notion I’ve critiqued here (1, 2, 3, 4) and elsewhere (5, 6) before. In any event, the new California law will:

require the operator of an Internet Web site, online service, online application, or mobile application to permit a minor, who is a registered user of the operator’s Internet Web site, online service, online application, or mobile application, to remove, or to request and obtain removal of, content or information posted on the operator’s Internet Web site, service, or application by the minor, unless the content or information was posted by a 3rd party, any other provision of state or federal law requires the operator or 3rd party to maintain the content or information, or the operator anonymizes the content or information. The bill would require the operator to provide notice to a minor that the minor may remove the content or information, as specified.
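
For those who prefer to see the rule as logic, here is a minimal, purely illustrative sketch in Python of the removal-eligibility conditions the digest describes. The class, field, and function names are my own invention rather than anything in the statute, and actual compliance turns on legal judgment, not a boolean check.

```python
# Hypothetical sketch of the eligibility logic described above; names are illustrative only.
from dataclasses import dataclass


@dataclass
class Post:
    author_id: str                    # registered user who posted the content
    posted_by_third_party: bool       # content was posted by someone other than the requester
    retention_required_by_law: bool   # another state or federal law requires keeping it
    anonymized: bool                  # the operator has anonymized the content


def must_honor_removal_request(post: Post, requester_id: str, requester_is_minor: bool) -> bool:
    """Return True if, under the digest's description, the operator must remove
    (or enable removal of) the content at the minor's request."""
    if not requester_is_minor:
        return False  # the provision covers registered users who are minors
    if post.author_id != requester_id or post.posted_by_third_party:
        return False  # content posted by a third party is excluded
    if post.retention_required_by_law:
        return False  # content the operator must maintain under other law is excluded
    if post.anonymized:
        return False  # anonymized content is excluded
    return True
```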

As always, the very best of intentions motivate this proposal. There’s no doubt that some digital footprints left online by minors could come back to haunt them in the future, and that concern for their future reputation and privacy is the primary motivation for the measure. Alas, noble-minded laws like these often lead to many unintended consequences, and even some thorny constitutional issues. I’d be hard-pressed to do a better job of itemizing those potential problems than Eric Goldman, of Santa Clara University School of Law, and Stephen Balkam, Founder and CEO of the Family Online Safety Institute, have done in recent essays on the issue. Continue reading →

Last month, it was my great pleasure to serve as a “provocateur” at the International Association of Privacy Professionals’ (IAPP) annual “Navigate” conference. The event brought together a diverse audience and set of speakers from across the globe to discuss how to deal with the various privacy concerns associated with current and emerging technologies.

My remarks focused on a theme I have developed here for years: There are no simple, silver-bullet solutions to complex problems such as online safety, security, and privacy. Instead, only a “layered” approach incorporating many different solutions (education, media literacy, digital citizenship, evolving societal norms, self-regulation, and targeted enforcement of existing legal standards) can really help us solve these problems. Even then, new challenges will present themselves as technology continues to evolve and evade traditional controls, solutions, or norms. It’s a never-ending game, and that’s why education must be our first-order solution. It better prepares us for an uncertain future. (I explained this approach in far more detail in this law review article.)

Anyway, if you’re interested in an 11-minute video of me saying all that, here ya go. Also, down below I have listed several of the recent essays, papers, and law review articles I have done on this issue.


Continue reading →

This afternoon, Berin Szoka asked me to participate in a TechFreedom conference on “COPPA: Past, Present & Future of Children’s Privacy & Media.” [C-SPAN video is here.] It was an in-depth, 3-hour, 2-panel discussion of the Federal Trade Commission’s recent revisions to the rules issued under the 1998 Children’s Online Privacy Protection Act (COPPA).

While most of the other panelists were focused on the devilish details about how COPPA works in practice (or at least should work in practice), I decided to ask a more provocative question to really shake up the discussion: What are we going to do when COPPA fails?

My notes for the event follow down below. I didn’t have time to put them into a smooth narrative, so please pardon the bullet points. Continue reading →

Today I had the great pleasure of moderating a panel discussion at a conference on the “Virtual Economy” hosted by Thomson Reuters and the International Center for Missing and Exploited Children. On my panel were representatives from the Bitcoin Foundation, the Tor Project, and the DOJ, and we had a lively discussion about how these technologies can potentially be used by criminals and what these open source communities might be able to do to mitigate that risk.

The bottom line message that came out of the panel (and indeed every panel) is that the Tor and Bitcoin communities do not like to see the technologies they develop put to evil uses, and that they are more than willing to work with policymakers and law enforcement to the extent that they can. On the flip side, the message to regulators was that they need to be more open, inclusive, and transparent in their decision making if they expect cooperation from these communities.

I was therefore interested in the keynote remarks delivered by Jennifer Shasky Calvery, the Director of the Treasury Department’s Financial Crimes Enforcement Network. In particular, she addressed the fact that since there have been several enforcement actions against virtual currency exchangers and providers, the traditional banking sector has been wary of doing business with companies in the virtual currency space. She said:

I do want to address the issue of virtual currency administrators and exchangers maintaining access to the banking system in light of the recent action against Liberty Reserve. Again, keep in mind the combined actions by the Department of Justice and FinCEN took down a $6 billion money laundering operation, the biggest in U.S. history.

We can understand the concerns that these actions may create a broad-brush reaction from banks. Banks need to assess their risk tolerance and the risks any particular client might pose. That’s their obligation and that’s what we expect them to do.

And this goes back to my earlier points about corporate responsibility and why it is in the best interest of virtual currency administrators and exchangers to comply with their regulatory responsibilities. Banks are more likely to associate themselves with registered, compliant, transparent businesses. And our guidance should help virtual currency administrators and providers become compliant, well-established businesses that banks will regard as desirable and profitable customers.

While it’s true that FinCEN’s March guidance provides clarity for many actors in the Bitcoin space, it is nevertheless very ambiguous about other actors. For example, is a Bitcoin miner who sells for dollars the bitcoins he mines subject to regulation? If I buy those bitcoins, hold them for a time as an investment, and then resell them for dollars, am I subject to regulation? In neither case are bitcoins acquired to purchase goods or services (the only use-case clearly not regulated according to the guidance). And even if one is clearly subject to the regulations, say as an exchanger, it takes millions of dollars and potentially years of work to comply with state licensing and other requirements. My concern is that banks will not do business with Bitcoin start-ups not because they pose any real criminal risk, but because there is too much regulatory uncertainty.

My sincere hope is that banks do not interpret Ms. Shasky Calvery’s comments as validation of their risk-aversion. Banks and other financial institutions should be careful about who they do business with, and they certainly should not do business with criminals, but it would be a shame if they felt they couldn’t do business with an innovative new kind of start-up simply because that start-up has not been (and may never be) adequately defined by a regulator. Unfortunately, I fear banks may take the comments to suggest just that, putting start-ups in limbo.

Entrepreneurs may want to comply with regulation in order to get banking services, and they may do everything they think they have to in order to comply, but the banks may nevertheless not want to take the risk given that the FinCEN guidance is so ambiguous. I asked Ms. Shasky Calvery if there was a way entrepreneurs could seek clarification on the guidance, and she said they could call FinCEN’s toll-free regulatory helpline at (800) 949–2732. That may not be very satisfying to some, but it’s a start. And I hope that any clarification that emerges from conversations with FinCEN is made public by the agency so that others can learn from it.

All in all, I think today we saw the first tentative steps toward a deeper conversation between Bitcoin entrepreneurs and users on the one hand, and regulators and law enforcement on the other. That’s a good thing. But I hope regulators understand that it’s not just the regulations they promulgate that have consequences for regulated entities; it’s also the uncertainty they can create through inaction.

Ms. Shasky Calvery also said:

Some in the press speculated that our guidance was an attempt to clamp down on virtual currency providers. I will not deny that there are some troublesome providers out there. But, that is balanced by a recognition of the innovation these virtual currencies provide, and the financial inclusion that they might offer society. A whole host of emerging technologies in the financial sector have proven their capacity to empower customers, encourage the development of innovative financial products, and expand access to financial services. And we want these advances to continue.

That is a welcome sentiment, but those advances can only continue if there are clear rules made in consultation with regulated parties and the general public. My hope is that FinCEN will revisit its guidance now that the conversation has begun, and that as other regulators consider new rules, they will engage the Bitcoin community early in order to avoid ambiguity and uncertainty.

In light of the tragic outcome of the Aaron Swartz case, there has been renewed interest in the failings of the Computer Fraud and Abuse Act and in the role of prosecutorial discretion in its application, so I went back to what I wrote about the law in 2009.

Back then, the target was Lori Drew, a far less sympathetic defendant. She was the victim both of the poorly drafted amendments to the CFAA that expanded its scope from government to private computer networks and of the politically motivated zeal of federal prosecutors reaching for something, anything, with which to punish otherwise legal but disfavored behavior.

But the dangers lurking in the CFAA were just as visible in 2009 as they are today.  Those who have recently picked up the banner calling for reform of the law might ask themselves where they were back then, and why the ultimately unsuccessful Drew prosecution didn’t raise their hackles at the time.

The law was just as bad in 2009, and just as dangerously twisted by the government.  Indeed, the Drew case, as I wrote at the time, gave all the notice anyone needed of what was to come later. Continue reading →

Defining “privacy” is a legal and philosophical nightmare. Few concepts engender more definitional controversies and catfights. As someone who is passionate about his own personal privacy — but also highly skeptical of top-down governmental attempts to regulate and/or protect it — I continue to be captivated by the intellectual wrangling that has taken place over the definition of privacy. Here are some thoughts from a wide variety of scholars that make it clear just how frustrating this endeavor can be:

  • “Perhaps the most striking thing about the right to privacy is that nobody seems to have any very clear idea what it is.” – Judith Jarvis Thomson, “The Right to Privacy,” in Philosophical Dimensions of Privacy: An Anthology 272, 272 (Ferdinand David Schoeman ed., 1984).
  • Privacy is “exasperatingly vague and evanescent.” – Arthur Miller, The Assault on Privacy: Computers, Data Banks, and Dossiers 25 (1971).
  • “[T]he concept of privacy is infected with pernicious ambiguities.” – Hyman Gross, The Concept of Privacy, 42 N.Y.U. L. Rev. 34, 35 (1967).
  • “Attempts to define the concept of ‘privacy’ have generally not met with any success.” – Colin Bennett, Regulating Privacy: Data Protection and Public Policy in Europe and the United States 25 (1992).
  • “When it comes to privacy, there are many inductive rules, but very few universally accepted axioms.” – David Brin, The Transparent Society: Will Technology Force Us To Choose Between Privacy and Freedom? 77 (1998).
  • “Privacy is a value so complex, so entangled in competing and contradictory dimensions, so engorged with various and distinct meanings, that I sometimes despair whether it can be usefully addressed at all.” – Robert C. Post, Three Concepts of Privacy, 89 Geo. L.J. 2087, 2087 (2001).
  • “[Privacy] can mean almost anything to anybody.” – Fred H. Cate & Robert Litan, Constitutional Issues in Information Privacy, 9 Mich. Telecomm. & Tech. L. Rev. 35, 37 (2002).
  • Privacy has long been a “conceptual jungle” and a “concept in disarray”; “[t]he attempt to locate the ‘essential’ or ‘core’ characteristics of privacy has led to failure.” – Daniel J. Solove, Understanding Privacy 196, 8 (2008).
  • “Privacy has really ceased to be helpful as a term to guide policy in the United States.” – Woodrow Hartzog, quoted in Cord Jefferson, Spies Like Us: We’re All Big Brother Now, Gizmodo, Sept. 27, 2012.
  • “[F]or most consumers and policymakers, privacy is not a rational topic. It’s a visceral subject, one on which logical arguments are largely wasted.” – Larry Downes, A Rational Response to the Privacy “Crisis,” Cato Institute Policy Analysis No. 716 (Jan. 7, 2013), at 6.

In my new Harvard Journal of Law & Public Policy article, “The Pursuit of Privacy in a World Where Information Control is Failing,” I build on these insights to argue that: Continue reading →

Sean Flaim

Sean Flaim, an attorney focusing on antitrust, intellectual property, cyberlaw, and privacy, discusses his new paper “Copyright Conspiracy: How the New Copyright Alert System May Violate the Sherman Act,” recently published in the New York University Journal of Intellectual Property and Entertainment Law.

Flaim describes content owners’ early attempts to enforce copyright through lawsuits as a “public relations nightmare” that humanized piracy and created outrage over large fines imposed on casual downloaders. According to Flaim, the Copyright Alert System is a more nuanced approach by the content industry to crack down on copyright infringement online, one that arose in response to a government failure to update copyright law to reflect the nature of modern information exchange.

Flaim explains the six stages of the Copyright Alert System in action, noting his own suspicions about the program’s stated intent as an education tool for repeat violators of copyright law online. In addition to antitrust concerns, Flaim worries that appropriate cost-benefit analysis has not been applied to this private regulation system, and, ultimately, that private companies are being granted a government-like power to punish individuals for breaking the law.

Download

Related Links

It was my honor today to be a panelist at a Hill event on “Apps, Ads, Kids & COPPA: Implications of the FTC’s Additional Proposed Revisions,” which was co-sponsored by the Family Online Safety Institute and the Association for Competitive Technology. It was a free-wheeling discussion, but I prepared some talking points for the event that I thought I would share here for anyone interested in my views about the Federal Trade Commission’s latest proposed revisions to the Children’s Online Privacy Protection Act (COPPA).

________

The Commission deserves credit for very wisely ignoring calls by some to extend the coverage of COPPA’s regulatory provisions from children under 13 all the way up to teens under 18.

  • That would have been a constitutional and technical enforcement nightmare. But the FTC realized that long ago and abandoned any thought of doing it. So that is a huge win, since we won’t be revisiting the COPA age verification wars.
  • That being said, with each tweak or expansion of COPPA, the FTC opens the door a bit wider to a discussion of some sort of age verification or age stratification scheme for the Internet.
  • And we know from recent AG activity (recall the old MySpace age verification battle) and Hill activity (i.e., the Markey-Barton bill) that there remains an appetite for doing something more to age-segment Internet populations.

Continue reading →

Stefan Krappitz, author of the book Troll Culture: A Comprehensive Guide, discusses the phenomenon of internet trolling. For Krappitz, trolling is disrupting people for personal amusement. Trolling is largely a positive phenomenon, Krappitz argues. While it can become very negative in some cases, for the most part trolling is simply an amusing practice that is no different from playing practical jokes. Krappitz believes that trolling has been around since before the age of the internet. He notes that the behavior of Socrates is reminiscent of trolling, because he pretended to be a student and then used his questioning to mock people who did not know what they were talking about. Krappitz also discusses anonymity and how it both contributes to and detracts from trolling, as well as where the line lies between good trolling and cyberbullying.


Download

Related Links