First Amendment & Free Speech

by Berin Szoka & Adam Thierer

This morning, the House Energy & Commerce Committee will hold a hearing on “Behavioral Advertising: Industry Practices And Consumers’ Expectations.” If nothing else, it promises to be quite entertaining: with full-time Google bashers Jeff Chester and Scott Cleland on the agenda, the likelihood that top Google officials will be burned in effigy appears high!

Chester, self-appointed spokesman for what one might call the People for the Ethical Treatment of Data (PETD) movement, is sure to rant and rave about the impending techno-apocalypse that will, like all his other Chicken-Little scenarios, befall us all if online advertisers are permitted to better tailor ads to consumers’ liking. After all, can you imagine the nightmare of less annoying ads that might actually convey more useful information to consumers? Isn’t serving up “untargeted” dumb banner ads for Viagra to young women and Victoria’s Secret ads to Catholic school kids the pinnacle of modern online advertising? God forbid we actually make advertising more relevant and interest-based! (Those Catholic school boys may appreciate the lingerie ads, but few will likely buy bras.)

Anyway, according to National Journal’s Tech Daily Dose, the hearing lineup also includes:

  • Charles Curran, Executive Director, Network Advertising Initiative
  • Christopher Kelly, Chief Privacy Officer, Facebook
  • Edward Felten, Director, Center for IT Policy, Princeton University
  • Anne Toth, Chief Privacy Officer & Vice President, Policy, Yahoo!
  • Nicole Wong, Deputy General Counsel, Google

That’s an interesting group and we’re sure that they will say interesting things about the issue. Nonetheless, because four of them have a corporate affiliation, some critics will inevitably use that fact to dismiss what they have to say about the sensibility of more targeted or interest-based forms of online advertising. So, we’d like to offer a few thoughts and pose a few questions to make sure that Committee members understand why, regardless of what it means for any particular online operator, targeting online advertising is very pro-consumer and essential to the future of online content, culture, and competition. As Wall Street Journal technology columnist Walt Mossberg has noted, “Advertising is the mother’s milk of all the mass media.” Much of the “free speech” we all cherish isn’t really free, but ad-supported!


Rebecca MacKinnon has an important piece in the Wall Street Journal today about China’s “Green Dam Youth Escort” filtering mandate and the danger of this model catching on with other governments. “More and more governments — including democracies like Britain, Australia and Germany — are trying to control public behavior online, especially by exerting pressure on Internet service providers,” she notes. “Green Dam has only exposed the next frontier in these efforts: the personal computer.”

She’s right, and that’s cause for serious concern.  Moreover, there’s the question of how corporations doing business in China should respond to demands and threats related to installing such filters. She notes:

In a world that includes child pornographers and violent hate groups, it is probably not reasonable to oppose all censorship in all situations. But if technical censorship systems are to be put in place, they must be sufficiently transparent and accountable so that they do not become opaque extensions of incumbent power — or get hijacked by politically influential interest groups without the public knowing exactly what is going on.

Which brings us back to companies: the ones that build and run Internet and telecoms networks, host and publish speech, and that now make devices via which citizens can go online and create more speech. Companies have a duty as global citizens to do all they can to protect users’ universally recognized right to free expression, and to avoid becoming opaque extensions of incumbent power — be it in China or Britain.

I generally agree with all that, but this is a difficult issue and one that I have struggled with personally. (See this “Friendly Conversation about Corporate High-Tech Engagement with China” that Jim Harper and I had three years ago.) But I do hope that more companies take a hard line with the Chinese, as well as with their own governments, when it comes to filtering mandates or even restrictive parental control defaults and settings [an issue I wrote more about in this paper: “The Perils of Mandatory Parental Controls and Restrictive Defaults.”] On that note, kudos to the business groups that have already signed on to a joint letter opposing China’s new filtering mandate.
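To make MacKinnon’s transparency point a bit more concrete, here is a minimal, purely hypothetical sketch of the kind of client-side keyword filter at issue. Nothing in it is drawn from Green Dam itself; the blocklist source, its entries, and the function names are all invented for illustration.

```python
# Hypothetical client-side keyword filter, sketched to illustrate the
# transparency problem. Not based on Green Dam's actual code; the blocklist
# source and entries below are invented.

BLOCKLIST_URL = "https://filter-vendor.example/blocklist.dat"  # hypothetical endpoint


def load_blocklist(raw_bytes: bytes) -> set:
    """Decode a vendor-supplied blocklist.

    If this file ships encrypted or undocumented, users have no way to audit
    what is being censored on their own machines.
    """
    return {line.strip().lower()
            for line in raw_bytes.decode("utf-8").splitlines()
            if line.strip()}


def is_blocked(page_text: str, blocklist: set) -> bool:
    """Block a page if any listed term appears anywhere in its text.

    Crude substring matching like this is also why such filters tend to
    over-block innocuous pages.
    """
    text = page_text.lower()
    return any(term in text for term in blocklist)


if __name__ == "__main__":
    blocklist = load_blocklist(b"term-a\nterm-b\n")  # placeholder entries
    print(is_blocked("An article that mentions term-a in passing.", blocklist))  # True
```

The dozen lines of filtering logic are trivial; the policy questions MacKinnon raises (who writes the blocklist, who can read it, and who can change it) are not.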

The Gawker offers a fascinating discussion of the legal right to anonymity:

“There is clearly a moral case that some people should be able to join the public debate and retain their anonymity,” Tench told Gawker. “And I think this will have a chilling effect. Blogs like this can only exist anonymously, and I imagine that anyone who wanted to set one up is thinking about this case.”

As well they should. But the notion that anonymous publishers have a right, in perpetuity, to keep their identities a secret—or that people who learn their identities are honor-bound not to reveal them—is nonsense.

Amen! One can resist, fiercely, government efforts to reduce online anonymity through age verification or identity authentication mandates, as Adam Thierer and I have argued most recently in our work about efforts to expand COPPA to cover adolescents (“COPPA 2.0,” which would indirectly mandate age verification for large numbers of adults for the first time). One might even argue that there are moral reasons to resist the urge to out pseudonymous/anonymous bloggers (just as one might avoid outing closeted gays out of respect for their privacy). But one need not accept the pernicious idea that the government should punish the outing of pseudonymous/anonymous writers, which is simply a restraint on legitimate free speech.

This exchange, cited by the Gawker article, is particularly interesting, and demonstrates how one can distinguish the question of whether outing is “right” or “appropriate” from the question of whether it should be punished by law:

When the National Review’s Ed Whelan revealed Publius, who writes for Obsidian Wings, to be a professor of law at the South Texas College of Law named John F. Blevins earlier this month, the palpable online outrage forced Whelan to apologize.

When it comes to theories about how to best raise kids, I’m a big believer in what might be referred to as “a resiliency approach” to child-rearing. That is, instead of endlessly coddling our children and hovering over them like “helicopter parents,” as so many parents do today, I believe it makes more sense to instill some core values and common sense principles and then give them some breathing room to live life and learn lessons from it. Yes, that includes making mistakes. And, oh yes, your little darlings might actually get some bumps and bruises along the way — or at least have their egos bruised in the process. But this is how kids learn lessons and become responsible adults and citizens. Wrapping them in bubble wrap and filling their heads with nothing but fear about the outside world will ultimately lead to the opposite: sheltered, immature, irresponsible, and unprepared young adults — many of whom expect someone else (the government, their college, their employer, or still their parents!) to be there to take care of them well into their 20s or even 30s. Again, you gotta let kids live a little and learn from their experiences.

This explains why I find Lenore Skenazy’s new book, Free-Range Kids: Giving Our Children the Freedom We Had Without Going Nuts with Worry, to be such a breath of fresh air. [Here’s her blog of the same name.] She argues that “if we try to prevent every possible danger or difficulty in our child’s everyday life, that child never gets a chance to grow up.” (p. 5) As she told Salon recently:

You want kids to feel like the world isn’t so dangerous. You want to teach them how to cross the street safely. You want to teach them that you never go off with a stranger. You teach them what to do in an emergency, and then you assume that generally emergencies don’t happen, but they’re prepared if they do. Then, you let them go out.

The fun of childhood is not holding your mom’s hand. The fun of childhood is when you don’t have to hold your mom’s hand, when you’ve done something that you can feel proud of. To take all those possibilities away from our kids seems like saying: “I’m giving you the greatest gift of all, I’m giving you safety. Oh, and by the way I’m taking away your childhood and any sense of self-confidence or pride. I hope you don’t mind.”

Exactly right, in my opinion. Again, let kids live and learn from it.  Teach lessons but then encourage ‘learning by doing’ and let them understand these things for themselves.  That is resiliency theory in a nutshell.


Cory Doctorow has called for a Wikipedia-style effort to build an open source, non-profit search engine. From his column in The Guardian:

What’s more, the way that search engines determine the ranking and relevance of any given website has become more critical than the editorial berth at the New York Times combined with the chief spots at the major TV networks. Good search engine placement is make-or-break advertising. It’s ideological mindshare. It’s relevance…

It’s a terrible idea to vest this much power with one company, even one as fun, user-centered and technologically excellent as Google. It’s too much power for a handful of companies to wield.

The question of what we can and can’t see when we go hunting for answers demands a transparent, participatory solution. There’s no dictator benevolent enough to entrust with the power to determine our political, commercial, social and ideological agenda. This is one for The People.

Put that way, it’s obvious: if search engines set the public agenda, they should be public.

He goes on to claim that “Google’s algorithms are editorial decisions.”   For Doctorow, this is an outrage: “so much editorial power is better vested in big, transparent, public entities than a few giant private concerns.”

I wish Doctorow well in his effort to crowdsource a Google-killer, but I’m more than a little skeptical that anyone would actually want to use his search engine of The People. My guess is that, like most things produced in the name of “The People” (Soviet toilet paper comes to mind), it probably won’t be much fun to use, and will likely chafe noticeably. (For the record, I love and regularly use Wikipedia; I just don’t think that model is likely to produce a particularly useful search engine. As Doctorow himself has noted of Google, “they make incredibly awesome search tools.”)

But I’m glad to see that Doctorow has conceded an important point of constitutional law: The First Amendment protects the editorial discretion of search engines, like all private companies, to decide what content to communicate. For a newspaper, that means deciding which articles or editorials to run. For a library or bookstore, it means which books to carry. For search engines, it means how to write their search algorithms.
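To see why it is fair to call ranking formulas editorial, consider a deliberately toy scorer, sketched below. The signals and weights are invented for illustration and have nothing to do with Google’s actual (unpublished) algorithm; the point is simply that every number in such a formula is a judgment about what “relevance” should mean.

```python
# A toy relevance scorer, purely for illustration. The signals and weights
# below are invented; they are not Google's (or anyone's) real ranking factors.
from dataclasses import dataclass


@dataclass
class Page:
    url: str
    keyword_hits: int    # how often the query terms appear on the page
    inbound_links: int   # a crude popularity proxy
    age_days: int        # how old the page is


def score(page: Page, w_keywords: float = 1.0, w_links: float = 0.5,
          w_freshness: float = 0.1) -> float:
    """Each weight answers an editorial question: does popularity matter more
    than textual relevance? How heavily should new pages be favored?"""
    freshness = 1.0 / (1.0 + page.age_days)
    return (w_keywords * page.keyword_hits
            + w_links * page.inbound_links
            + w_freshness * freshness)


pages = [
    Page("https://example.org/old-classic", keyword_hits=3, inbound_links=200, age_days=900),
    Page("https://example.org/new-post", keyword_hits=8, inbound_links=5, age_days=2),
]

# Rank with two different opinions about how much inbound links should count:
for weights in ({"w_links": 0.5}, {"w_links": 0.01}):
    top = max(pages, key=lambda p: score(p, **weights))
    print(weights, "->", top.url)   # the "top result" flips
```

Someone has to pick and defend those weights; that is the sense in which writing a search algorithm is an exercise of editorial judgment, whether the author is a private company or “The People.”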

The first meeting of the Online Safety Technology Working Group (OSTWG) took place today and I just wanted to provide interested parties with relevant info and links in case they want to keep track of the task force’s work. As I mentioned back in late April, this new task force was established by the “Protecting Children in the 21st Century Act” (part of the “Broadband Data Improvement Act,” Pub. L. No. 110-385) and it will report to the Assistant Secretary of Commerce for Communications and Information at the U.S. Department of Commerce’s National Telecommunications and Information Administration (NTIA).

I’m happy to be serving on this new working group and I am particularly honored to be serving as the chairman of one of the four subcommittees. The four subcommittees will address: data retention, child pornography, educational efforts, and parental controls technologies. I am chairing that last subcommittee on parental controls. The task force has about 35 members and we have a year to conduct our research and report back to Congress. Relevant links providing additional details about this task force are available on the NTIA website.

Of course, this is certainly not the first task force to explore online safety issues. There was the COPA Commission (2000), the “Thornburgh Commission” report (2002), the U.K. “Byron Commission” report (2008), the Harvard Berkman Center’s Internet Safety Technical Task Force (2008), and the NCTA-iKeepSafe-CommonSenseMedia “Point Smart, Click Safe” working group, which is due to issue its final report shortly. [Full disclosure: I was a member of the last two task forces as well.] I’m currently working on a short paper that attempts to summarize the remarkably similar findings of these important child safety working groups. Generally speaking, they all concluded that education and empowerment, not regulation, were the real keys to moving forward and making our kids safer online.

Berin recently encouraged me to re-read Thomas Sowell’s The Vision of the Anointed: Self-Congratulation as a Basis for Social Policy, which I hadn’t looked at since I first read it back in 1995 or ’96. I’m glad I did, since Sowell’s work has always been profoundly influential on my thinking (especially his masterpiece, A Conflict of Visions) and I had forgotten how useful The Vision of the Anointed was in helping me understand the recurring model that drives ideological crusades to expand government power over our lives and economy.

“The great ideological crusades of the twentieth-century intellectuals have ranged across the most disparate fields,” Sowell noted in the book.  But what they all had in common, he argued, was “their moral exaltation of the anointed above others, who are to have their different views nullified and superseded by the views of the anointed, imposed via the power of government.” (p. 5)  These elitist, government-expanding crusades shared several key elements, which Sowell identified as follows:

  1. Assertion of a great danger to the whole society, a danger to which the masses of people are oblivious.
  2. An urgent need for government action to avert impending catastrophe.
  3. A need for government to drastically curtail the dangerous behavior of the many, in response to the prescient conclusions of the few.
  4. A disdainful dismissal of arguments to the contrary as either uninformed, irresponsible, or motivated by unworthy purposes.

You can see this model at work on a daily basis today with our government’s various efforts to reshape our economy, but I think this model is equally applicable to debates over social policy and speech control.  In particular, the various “technopanics” I have been writing about recently fit this model. (See 1, 2, 3, 4, 5).  For example, consider how this plays out in the debate over online social networking:


As Berin mentioned last week, we have a new paper out on proposals to expand the Children’s Online Privacy Protection Act (COPPA) of 1998. We generically refer to those COPPA-expansion efforts as “COPPA 2.0.” Hence, the title of our paper: “COPPA 2.0: The New Battle over Privacy, Age Verification, Online Safety & Free Speech.” To recap what Berin already noted, in the name of improving online child safety, some legislators and state attorneys general (AGs) are advocating the expansion of COPPA’s “verifiable parental consent” model of age verification before certain sites or services may collect, or enable the sharing of, personal information from children.

Unlike “COPPA 1.0,” however, which only applied to children under the age of 13, “COPPA 2.0” would apply to all minors up to age 17.  Moreover, the range of sites covered by the new law would generally be expanded to include just about any site or service with social networking functionality.

Since Berin has already summarized our general concerns with efforts to expand COPPA’s “verifiable parental consent” online age verification system to cover more online users and sites, I thought I would focus here on what I believe will be the most controversial (and important) part of our paper — our discussion about how COPPA 2.0 affects the speech rights of both adults and adolescents.


Adam Thierer & I have just released a detailed examination (PDF) of brewing efforts to expand the Children’s Online Privacy Protection Act of 1998 to cover adolescents and potentially all social networking sites—an approach we call “COPPA 2.0.”

As Adam explained on Larry Magid’s CNET podcast, COPPA mandates certain online privacy protections for children under 13, most importantly that websites obtain the “verifiable consent” of a child’s parent before collecting personal information about that child or giving that child access to interactive functionality that might allow the child to share their personal information with others. The law was intended primarily to “enhance parental involvement in a child’s online activities” as a means of protecting the online privacy and safety of children.

Yet advocates of expanding COPPA—or “COPPA 2.0”—see COPPA’s verifiable parental consent framework as a means for imposing broad regulatory mandates in the name of online child safety and concerns about social networking, cyber-harassment, etc. Two COPPA 2.0 bills are currently pending in New Jersey and Illinois. The accelerated review of COPPA to be conducted by the FTC next year (five years ahead of schedule) is likely to bring to Washington serious talk of expanding COPPA—even though Congress clearly rejected covering adolescents age 13-16 when COPPA was first proposed back in 1998.
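For readers who want the mechanics in miniature, here is a hypothetical sketch of the sort of sign-up gate the consent model implies. The function names, thresholds, and flow below are invented for illustration and are not drawn from the statute or from any real site’s code; they simply show why raising the age threshold pushes sites toward verifying the age of every user, adults included.

```python
# Hypothetical sign-up gate, sketched to show the consent model's mechanics.
# The names and flow are invented for illustration; they are not drawn from
# the statute or from any real site's code.

COPPA_1_THRESHOLD = 13   # current law: children under 13
COPPA_2_THRESHOLD = 18   # proposed expansion: all minors under 18


def may_collect_personal_info(claimed_age: int,
                              has_verifiable_parental_consent: bool,
                              threshold: int = COPPA_1_THRESHOLD) -> bool:
    """Allow data collection (or access to interactive sharing features) only
    for users at or above the age threshold, or below it with verified
    parental consent."""
    if claimed_age >= threshold:
        return True
    return has_verifiable_parental_consent


# claimed_age is self-reported, which is the rub: once the threshold covers
# 13- to 17-year-olds on general-purpose social sites, a site that wants to
# rely on this check has little choice but to verify the ages of all users,
# adults included.
print(may_collect_personal_info(16, False))                               # True under current COPPA
print(may_collect_personal_info(16, False, threshold=COPPA_2_THRESHOLD))  # False under COPPA 2.0
```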

We’ll discuss some of the key points of our paper in a series of blog posts, but here are the top nine reasons for rejecting COPPA 2.0. Such an approach would:

  • Burden the free speech rights of adults by imposing age verification mandates on many sites used by adults, thus restricting anonymous speech and essentially converging—in terms of practical consequences—with the unconstitutional Child Online Protection Act (COPA), another 1998 law sometimes confused with COPPA;
  • Burden the free speech rights of adolescents to speak freely on—or gather information from—legal and socially beneficial websites;
  • Hamper routine and socially beneficial communication between adolescents and adults;
  • Reduce, rather than enhance, the privacy of adolescents, parents and other adults because of the massive volume of personal information that would have to be collected about users for authentication purposes (likely including credit card data);


craigslist has filed a complaint against South Carolina Attorney General Henry McMaster, seeking to enjoin him from prosecuting the site for displaying the solicitations to prostitution that sometimes appear there. The complaint cites section 230 of the Communications Decency Act, the First Amendment, and a few other laws that craigslist believes protect it from liability.

The complaint makes a pretty good case that craigslist has taken reasonable steps, working with law enforcement, to keep prostitution off the site. With that it has done its part. If prosecutors want to go after prostitution, they can use craigslist to do so. They should not attack the messenger if consenting adults are trying to exchange money for sexual services in their local areas.