Lessig vs. Rosen on Net Porn Regulation

September 16, 2004

I’ve recently read two very important pieces on the regulation of online pornography that I want to bring to everyone’s attention. The first, which I find deeply troubling, is a column entitled “Free Porn” by Professor Lawrence Lessig in this month’s Wired magazine.

The second is an impressive new essay in The New Atlantis entitled “The End of Obscenity” by Jeffrey Rosen. Rosen is the legal affairs editor of The New Republic, and a professor of law at the George Washington University Law School. Rosen’s essay is must reading for anyone still searching in vain for a way to censor online pornography. But let’s begin with Lessig’s new article.


What’s Left for Congress in a Post-COPA World?

In his latest Wired column, Lessig notes that the Supreme Court recently rejected the Child Online Protection Act’s (COPA) approach of banning the transmission over the Internet, to anyone under the age of 18, of any material deemed “harmful to minors.” The Court held that unless the government can show that “less restrictive alternatives” (like software filters) can’t do the same job just as well, the First Amendment prevents Congress from regulating online speech in this fashion. As Lessig correctly notes, “That’s an impossibly difficult standard to meet, meaning Internet porn (though not obscenity) now lives in a regulation-free zone.”

While Lessig agrees that the Court was correct in striking down COPA as unconstitutional, he argues that “we need something smarter than this schizophrenic approach to regulate speech on the Internet. For the Internet changes things. And sometimes it needs a little regulation to protect liberty better.”

Importantly, Lessig worries that private Internet content filters–which the Supreme Court specifically cited as a “less restrictive alternative” to government regulation–may actually be worse than Net regulation because private filters are “idiotic” and “block all sorts of speech that should not be blocked and skip all sorts of speech that should. Worse, their errors are not subject to review: If a product like Net Nanny blocks a gay rights site, there’s nothing the ACLU can do about it. Net Nanny’s censorship is free of the First Amendment. Demand for filters would fall, however, if the government enacted effective rules that enabled parents to block porn from kids. Why buy Net Nanny software for $40 when you can get the same protection through regulation for free?”
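Lessig’s “idiotic filters” charge is easy to illustrate. Many early filters worked roughly like a keyword blocklist, and a blocklist over-blocks and under-blocks in exactly the way he describes. Here is a toy sketch of that failure mode; the word list and sample pages are my own invented examples, not Net Nanny’s actual rules or any real product’s behavior:

```python
# A toy keyword filter illustrating how naive blocklists misfire.
# The blocklist and sample pages are invented for illustration only;
# they are not the actual rules of Net Nanny or any real filter.
BLOCKLIST = {"porn", "xxx", "sex"}

def is_blocked(page_text: str) -> bool:
    """Block a page if any blocklisted word appears anywhere in it."""
    words = page_text.lower().split()
    return any(word.strip(".,!?") in BLOCKLIST for word in words)

# A hypothetical gay-rights health page gets blocked for discussing
# safe sex, while a porn site that avoids the keywords slips through.
print(is_blocked("Resources on safe sex and coming out."))   # True
print(is_blocked("Hot pics! Adults only. Click to enter."))  # False
```

And, as Lessig notes, when a filter errs this way there is no appeal: the blocklist is a private product decision, not a reviewable act of government.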

Brown Bags for Cyberspace?

This leads Lessig to ponder the prospect of a sort of “brown-bag” rule for cyberspace as a potential compromise between COPA and private filtering. That is, some municipal governments have implemented zoning ordinances dictating how newsstands, adult book shops or video stores can (or rather, cannot) display materials of a prurient nature. Thus, Hustler has to be in a brown wrapper, covered behind some black plastic “blinders,” placed on the top shelf out of junior’s reach, or all of the above. And video stores can’t play porn movies in their windows for any passers-by to see.

How, Lessig asks both in his recent Wired column as well as in his old book Code, might government construct such “a simple law” for cyberspace? In Chapter 12 of Code, he spends several pages discussing possible “architectures that zone speech.” In his Wired column he elaborates on how such zoning laws might apply to cyberspace:

“What rules would be effective? Imagine a simple requirement that commercial Web sites carrying material deemed ‘harmful to minors’ mark that content with a newly minted metadata tag – say, < porn >. (Obviously, the details of making this system work are another matter.) The tag could be read by HTML-rendering software but would be invisible to users. It would divide the Net into zones the way the V-chip was meant to divide up television. If such a rule were effectively enforced, it would spur the market to supply browsers that parents could use to block < porn >-tagged content, essentially creating “kids-mode” browsing. Developers would thus supply a technology inspired by law to achieve a policy that better protects speech than no law at all.”
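To make the mechanics concrete, here is a minimal sketch of what “kids-mode” browsing built around such a tag might look like. Everything here is an assumption for illustration: no < porn > tag was ever standardized, and the tag spelling, function names, and sample page below are mine, not Lessig’s or any browser vendor’s:

```python
# A minimal sketch of "kids-mode" filtering along the lines Lessig
# describes. The <porn> tag is hypothetical -- it was never actually
# standardized -- so this is an illustration, not his implementation.
from html.parser import HTMLParser

class PornTagDetector(HTMLParser):
    """Scans an HTML document for the hypothetical <porn> metadata tag."""

    def __init__(self):
        super().__init__()
        self.flagged = False

    def handle_starttag(self, tag, attrs):
        # The tag would be read by rendering software but be invisible
        # to users; any occurrence flags the whole page.
        if tag == "porn":
            self.flagged = True

def kids_mode_render(html_source: str) -> str:
    """Return the page as-is, or a block notice if it carries the tag."""
    detector = PornTagDetector()
    detector.feed(html_source)
    if detector.flagged:
        return "<html><body>Blocked by kids-mode browsing.</body></html>"
    return html_source

# A hypothetical tagged page: blocked in kids mode, untouched otherwise.
page = "<html><head><porn></head><body>Adults only.</body></html>"
print(kids_mode_render(page))
```

Even this toy version hints at the enforcement problem Lessig brackets off: the scheme works only if every site carrying “harmful to minors” material dutifully self-applies the tag.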

Lessig goes on to argue that “tagging” is a sort of cyber-zoning requirement that could pass constitutional muster:

“And what of the First Amendment? The rule wouldn’t burden porn-consuming adults at all. It would burden only slightly parents who install Web browsers to detect the porn tag. And it would burden content providers. First, with the technical requirement that they tweak their code to tag porn, and, more significantly, with a second requirement that they judge whether their material is in fact ‘harmful to minors.’ That last burden is very real. But the same burden already applies (constitutionally) to booksellers by laws that prohibit them from selling porno mags to kids. If Congress crafted the rule narrowly enough, it would survive even this Court’s most stringent review. And if it stanched demand for idiot filters, then it would protect free speech more than no rule at all. Both the Court and free speech advocates could defend this regulation – as vigorously as they defend pornographers.”

Opening the Door to Net Censorship

Hmmm… well, I’m a free speech advocate and I’m not about to vigorously defend this regulatory scheme. (Nor, for that matter, do I “vigorously defend pornographers,” but more on that point at the end of this rant). I have problems with Lessig’s approach for both practical and principled reasons.

Practically speaking, government can’t really “zone” cyberspace the same way it zones the corner newsstand since cyberspace isn’t a tangible medium, and, more importantly, it is a global communications medium that defies traditional community boundaries. Therefore, applying some variant of the Supreme Court’s traditional “contemporary community standards” test from Miller v. California is rife with problems. (The old Miller obscenity test asked: (1) whether “the average person, applying contemporary community standards” would find that the work, taken as a whole, appeals to the prurient interest; (2) whether the work depicts or describes, in a patently offensive way, sexual conduct specifically defined by the applicable state law; and (3) whether the work, taken as a whole, lacks serious literary, artistic, political, or scientific value.)

Moreover, Lessig says his regulatory tagging approach “would divide the Net into zones the way the V-chip was meant to divide up television.” As I wrote recently, only a small percentage of the population is using the V-Chip and, precisely because of that, policymakers are now proposing to up the ante by imposing more direct content controls on “excessively violent” programming on television. And, again, the notion that we can “divide the Net into zones” is much easier said than done. TV is an old and still fundamentally stupid (in a technological sense) medium. It is ubiquitous and successful precisely because it is so easy to set up and operate. That also makes it relatively easy to regulate in comparison to the Internet and cyberspace. “Zoning” content and websites globally will be a daunting task. Thus, I don’t think the V-Chip is the sensible regulatory model for cyberspace that Prof. Lessig suggests it is.

But most importantly, any cyber-zoning scheme will require that someone in government attempt to define what “harmful to minors” should mean for purposes of “tagging” the websites to be screened or filtered. Again, given the Internet’s global reach and the impossibility of quarantining online speech geographically, it is unclear how a “contemporary community standards” test can ever be applied unless “the community” was potentially redefined as the entire United States. (More on this below when we turn to Rosen’s essay). There are real dangers associated with such tagging or zoning regulations. Here’s how my friend Robert Corn-Revere, one of America’s finest First Amendment lawyers, describes the problem with Lessig’s “simple law” approach:

“Given the standards of the many communities that will be reached, publishers on the Internet must anticipate what might be considered “harmful to minors” in each of them and to ‘zone’ their speech accordingly. This is particularly problematic if speakers face any type of sanction if they fail to ‘properly’ label or zone their expression. I would expect many to comply either by zoning speech according to their best guess as to the standard of the least tolerant community, or simply to restrict what information they make available.”

In other words, if Congress attempts to preserve a Miller “contemporary community standards” test and extend it to the Internet for purposes of regulating online speech, it would lead to a “lowest common denominator” system of de facto Net regulation since operators would need to conform their speech to the most puritanical communities in America to play it safe. Needless to say, that raises some serious constitutional issues and creates a variety of practical problems for many website operators.

Again, there is the alternative of moving away from the “contemporary community standards” test and toward a new national standard for this global medium. Here’s where we need to turn to Jeff Rosen’s new article to explain why that is not a workable solution either and why we should reject Lessig’s proposal on grounds of principle.

What Rosen Teaches Us

Rosen’s new essay on “The End of Obscenity” begins by noting that although this latest COPA decision was the third time in less than eight years that the Supreme Court has grappled with online speech issues, the justices have again dodged the fundamental question raised in these cases: “Is there a coherent category of speech on the Internet that can be regulated as obscene?” Apparently not, he says, “For it is increasingly obvious, as lower courts have recognized, that the exploding demand for Internet pornography and the impossibility of restricting it to any geographic area makes the Supreme Court’s traditional tests for defining obscenity incoherent. Rather than encouraging Congress to search for more effective technologies for controlling obscene speech, the Court will eventually have to recognize that the effort to regulate obscenity has been doomed by culture, by technology, and by the Court’s own increasingly expansive embrace of individual autonomy as the highest good.”

Rosen notes that case law on this front has always been a mystifying mess, especially the Miller test. Things were confusing enough in 1973 before the dawn of the VCR and the rise of the Net, but after those developments, applying “community standard” tests for obscenity to increasingly national technologies and forms of commerce would become impossible. “[A] national consensus about obscenity is a fantasy,” he argues.

And here’s where Rosen boldly goes where no man has gone before in terms of laying out the facts about pornography in America and explaining why we must come to grips with the reality that porn is part of the American way of life whether some of us like it or not. He cites a litany of statistics about porn consumption by average Americans, including the fact that one fourth of all search engine requests every day (68 million) are for pornographic material. There’s simply no way that an industry can generate $10 billion per year in revenues and have more than 70,000 websites unless it is a MASS phenomenon that has a constituency in every city of this nation. “Pornography is everywhere, suggesting that there is no national consensus against it and no vast disparity from one locale to another,” argues Rosen.

Thus, he continues, “the idea that it clearly violates national community standards is a hypocrisy that can no longer be sustained in light of clickstream data and consumption statistics.” Luckily, with the latest COPA decision, Rosen notes that the Court is (perhaps somewhat reluctantly) moving in this direction. He argues that the justices have come to apply John Stuart Mill’s harm principle (that private behavior should not be regulated absent harm to others) to the matter of online obscenity / porn regulation. Consequently, he notes, “the foundations of [the Court’s] obscenity jurisprudence are on extremely shaky ground.” In particular, the “community standards” approach is doomed since both cultural and technological evolution have made it increasingly irrelevant or unworkable.

Are Parents Powerless?

Rosen’s sobering but accurate appraisal of the current state of obscenity jurisprudence will leave many lawmakers and citizens (especially parents) wondering: Isn’t there anything we can do about all this porn on the Net and the filth that my kids might access online?

There’s no denying the fact that monitoring a child’s viewing and listening habits in today’s modern media marketplace is a daunting task. Their eyes and ears are bombarded by a kaleidoscope of images and a cacophony of sounds, but such is the nature of our modern world and parents need to learn to deal with it. As the parent of two small children, I’ve been giving this issue more thought than ever before and I have to admit that there are times when even an old free speech radical like myself can begin to wonder if a helping hand from the government is needed here to make my job easier.

The problem with speech and content controls, however, is always one of the slippery slope. I’m always careful about making slippery slope arguments after hearing a brilliant lecture on the dangers of slippery slope reasoning by Eugene Volokh of UCLA. (See his Harvard Law Review essay on the subject, “The Mechanisms of the Slippery Slope.” In my opinion, this piece is destined to be a classic.) But I think the danger of a slippery slope here is very real since lawmakers have, throughout history, exhibited an insatiable appetite to censor what society can see or hear. If we give them an inch, they will take a mile and potentially much, much more. Even a “simple law” along the lines that Lessig suggests might lead to more widespread censorship of the media in ways we cannot predict today. Just as the “simple law” of the V-Chip is now leading to more widespread calls for regulation of “excessive violence” on television and cable, Lessig’s tagging approach will likely prove sufficiently flawed that Congress will come back again and say additional content controls are needed “for the children” or “to empower parents.”

When outlining his “simple requirement” that commercial Web sites carrying material deemed “harmful to minors” mark that content with a metadata tag, Lessig at least acknowledges that the devil may be in the details. He says in one short parenthetical aside: “Obviously, the details of making this system work are another matter.” Well, I’m glad he’s at least willing to concede that there may be a bit of a problem here! Leaving it up to someone in Congress or the FCC to decide what is “harmful to minors” opens up a very big can of worms. Who knows where it might end.

In the end, however, even if there is no slippery slope associated with Lessig’s proposed rule, there are other reasons to oppose it. Namely, much as democracy is the worst form of government except all the others, those private filtering solutions (while flawed) are better than any of the government solutions out there. Better that private filtering systems improperly tag certain content than to have a mandatory government metadata tag doing so. Generally speaking, it is easier to correct the flaws of voluntary private filtering systems than coercive public regulatory regimes. (For all the grief that Prof. Lessig gives America’s increasingly regulatory-minded copyright system in his new book Free Culture–much of which I agree with, by the way–it is surprising to me that he is willing to place so much faith in a different regulatory process also dealing with content. What gives, Larry?)

It’s Parental Responsibility, Stupid

Finally, we need to return to first principles here and not lose sight of what this debate is really all about: personal and parental responsibility. Regardless of how difficult it is to police our children’s viewing and listening habits in our modern world, that should not serve as an excuse to call in government to play the role of surrogate parent on our behalf. Beyond using private filters in the home, here are a few other suggestions I’ve come across to help parents like me cope:

(1) Only allow your child on the Web during certain pre-determined hours of the day and only for certain limited durations. Encourage them to use the extra time to read a good book or go out and get a little good old-fashioned exercise.

(2) Establish one computer in the home as the kids’ computer that they can use, and explain to them that the others will be off limits (and then password-protect the others). Install all the relevant filters and controls on their machine and then also place it somewhere in the home where Mom and Dad can always keep an eye on it. For example, a good friend of mine chooses to have only one computer in his home for his family and places it on a desk in the corner of the living room with the monitor facing out for all to see. (Mom can see the monitor from the kitchen and Dad can see it from his La-Z-Boy.) This makes it far less likely junior will stray too far down the wrong path in cyberspace. If they do, Mom and Dad will know right away and talk to them about it.

(3) If you establish an e-mail account for your child, explain to them that there will be limits on its use, just as our parents placed limits on the use of the phone in the old days. You might also check their e-mail for them before they get on to see if all pornographic spam is being filtered out.

(4) Finally, before you allow your child to spend one second on the Web, sit down with them and have a mature talk about the fact that there are a lot of weird, demented, even sick things out there in cyberspace that they might stumble upon, just as they might in the real world. Explain to them that this is not necessarily the way the whole rest of the world works and that sometimes people say and do stupid things for reasons that are difficult to understand. Most importantly, tell them to feel free to ask Mom or Dad questions about what they see or hear on the Net (and on TV or radio too) and explain to them in a loving fashion that you will not bite their heads off for any questions they may ask, no matter how sensitive those questions might be.

Again, there’s a term for what I’m describing here: parental responsibility. It didn’t use to be such a radical concept, but these days I fear that parents have grown increasingly lazy and are openly inviting Uncle Sam into their homes to do this job for them. My counter-proposal is simple: When it comes to minding the kids, I’ll take responsibility for teaching mine about the realities of this world, including the unsavory bits. You worry about your own. Let’s not call in the government to do this job for all of us (as well as the millions of Americans who don’t even have any children!).

One final note regarding the comment made by Lessig in his recent Wired essay about free speech advocates vigorously defending pornographers. This is the sort of shameful rhetoric I’ve come to expect from the social conservatives I debate on this issue, but not from someone like Prof. Lessig. You will not find a more passionate defender of free speech than me, but I don’t spend a minute of my time vigorously defending pornographers. This isn’t about defending pornographers, Larry; it’s about defending freedom and personal responsibility over government censorship. I’m sorry you’re on the wrong side.
