Summary of Latest ICRA Summit on Internet Free Expression & Child Protection

September 15, 2006

On Wednesday, I was in New York City attending another installment of the Internet Content Rating Association’s (ICRA) outstanding ongoing series of summits on child protection & freedom of expression in our new information age. As with previous ICRA events in Washington, Sunnyvale, CA, and Brussels, the focus of the New York roundtable discussion was: What steps can we take to shield children from potentially objectionable Internet or media content without repressing freedom of speech / expression? In particular, the role of private self-regulation (labeling, rating, filtering, educating, etc.) was discussed and debated in detail.

These issues are the focus of much of my ongoing research at PFF, and you might also recall that I wrote about a major summit on similar topics that took place this June in Washington, D.C., which featured Senator Hillary Clinton among other distinguished speakers. The Congressional Internet Caucus also has an upcoming series of Capitol Hill panel discussions on these issues and just released a compilation of short white papers summarizing what various groups are doing about online child safety. So this continues to be a hot topic.


ICRA’s summit in NYC this week once again featured an all-star cast of panelists and other representatives from a wide variety of organizations, including: AOL & Time Warner (who hosted the event at their NYC headquarters), Comcast, the Entertainment Software Rating Board (ESRB), Microsoft, MPAA, News Corp., British Telecom, AT&T, the Markle Foundation, Progressive Policy Institute, Solarsoft, Digimarc, and many others.

Stephen Balkam, the founder and CEO of ICRA, kicked off the event by noting that the unprecedented explosion of content and new digital media devices has prompted increased legislative and regulatory scrutiny from government officials across the globe. Indeed, the heat is really on in the European Union, Stephen noted, and some over there, like EU Commissioner Viviane Reding, want to greatly expand traditional government content controls to cover all new forms of digital and online media. (See these three papers by my PFF colleague Patrick Ross for more details on Reding’s efforts in the EU to expand content regulation.)

Stephen also reminded the crowd of the significant new regulatory push underway here in the U.S. (see my summary of this activity at the top of this recent paper), including proposals like the new “Internet Safety Act,” which would mandate that many Internet websites carry a sexually-explicit label. And bills are floating around in Congress to expand broadcast indecency regulation to cable and satellite television.

Prof. Wu on the Challenges Ahead

Stephen then turned the stage over to Columbia Law School Professor Tim Wu, one of America’s most brilliant legal minds on Internet policy issues (and I say that even though Tim and I often find ourselves on opposite sides of many Internet policy debates). Tim provided a “50,000-foot” overview of the key issues at play in this debate, something he was well-suited to do in light of the book he recently co-authored with Prof. Jack Goldsmith entitled “Who Controls the Internet? Illusions of a Borderless World.”

Prof. Wu began by noting that, in many ways, we’ve been having the same discussion about the same issues for the last ten years. But even though the discussion is still cast in terms of protection of minors vs. protection of freedom of speech / expression, the one thing that has changed over the last few years is that the debate has become much more international in scope. This complicates the debate greatly, Prof. Wu argued, because other countries have different values and policy objectives. And most of them don’t have a First Amendment stopping them from aggressively regulating online speech and expression.

Because of the existence of the First Amendment here in the States, however, Wu noted that it is very difficult for government to craft rules aimed at protecting children without running afoul of the Constitution’s protection of freedom of speech / expression for adults. The courts have struck down most legislative enactments on this front on the grounds that government regulation does not represent the “least restrictive means” of going about the job in light of the many private filtering tools and parental control technologies that exist in the marketplace. (Incidentally, Prof. Wu thinks the courts have actually gone too far in some cases and used the “least restrictive means” test to undermine some rules that he found to be reasonable.)

Regardless, Wu argued, the courts’ rejection of most legislative enactments has forced the debate to change here in the States for both private and public players. Industry is now expected to step up and do more to offer workable filtering tools and other parental controls. Interestingly, Prof. Wu was concerned that industry may be expected by some to go too far and become increasingly intrusive into online communications in an attempt to police content on their networks. (Wu mentioned how this concern motivated much of the Net neutrality debate, for example.) Government officials, on the other hand, now appear to be considering more solutions that seek to empower parents instead of just censoring speech outright.

Regardless of how governments here or abroad deal with these issues in coming years, Prof. Wu predicts that we are likely headed toward a major international relations / trade problem in light of how policymakers across the globe are increasingly blazing their own trails and trying to make the global Internet conform to their local laws and standards. In other words, because each country has its own unique “community standards” that it wishes to preserve or protect, this will give rise to thorny international relations / trade controversies. Prof. Wu foresees a coming “age of trade-offs” in which bilateral or multilateral agreements are struck between nations that may be forced to hand each other some concessions to achieve their various goals. For example, Europeans may have to give up trying to regulate online “hate speech” and Americans may have to give up trying to eradicate all online gambling activities. But I found myself wondering: Could the trade-off work in the opposite direction, with governments doing each other’s dirty work by becoming willing handmaidens of state-enforced morality? Might U.S. officials, for example, agree to pressure domestic firms to clean up online “hate speech” if the EU agrees to help shut down offshore gambling sites, many of which are operated by European companies?

After Prof. Wu finished, three panels followed. In order to encourage an open, frank discussion, however, the rest of the day’s conversation was off-the-record, so I will just summarize some of the major themes that came out of those three panels here.

Traditional Ratings & Empowerment Schemes: Are They Working?

The first panel featured a discussion of how some traditional media operators (motion pictures, television broadcasters, and video game makers) are dealing with new threats to their existing voluntary ratings schemes and empowerment efforts. One common theme that came out of this discussion was that private ratings schemes frequently come under fire from a variety of critics for a variety of reasons. Nonetheless, each of the major ratings systems provides a fairly effective way to communicate key information about content to the public while also avoiding government regulation. But the rise of user-generated content (or modifications of existing content) creates complex new challenges since it can’t be rated as easily.

Also, even though there is a lot of awareness about existing ratings systems and parental controls (on TVs, in cable set-top boxes, in video game consoles, etc), many parents never take advantage of them. This continues to be a perplexing problem since industry still gets blamed by many parents when their kids see or hear something that the parents find offensive even though they had the tools and information to make an informed decision on their own. Thus, even though many parents say in polls and surveys that self-regulation is preferable to government regulation, at the end of the day, “a lot of people still want someone else to take responsibility for their kids,” noted one participant.

But both this panel and the following one featured an extended discussion about whether or not there was confusion about ratings in the marketplace. Are there too many systems? Should there be more consistency among them? Can or should they scale internationally? And can we use third-party proxies? That is, can other groups’ private ratings be “mapped on” to existing systems (e.g., the Parents Television Council, Common Sense Media, your local church)? It might work for television, some suggested, but it is not likely to work for Internet content because it is far easier to deal with a handful of intermediaries for major TV programming than with the millions of diffuse creators and distributors of online content.
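For what it’s worth, the “mapping” idea is easiest to picture in code. Below is a minimal Python sketch of how one hypothetical third-party group’s verdicts might be translated onto the existing TV Parental Guidelines scale; the group, its categories, and the mapping are invented purely for illustration and do not reflect any organization’s actual ratings.

```python
# A rough sketch of the "mapping" idea: a hypothetical third-party group's
# content verdicts translated onto the existing TV Parental Guidelines scale.
# The verdict names and the mapping itself are invented for illustration.

# Hypothetical third-party verdicts, keyed by program title.
THIRD_PARTY_RATINGS = {
    "Example Family Show": "green",    # fine for all ages
    "Example Crime Drama": "yellow",   # parental caution
    "Example Late Show":   "red",      # adults only
}

# Invented mapping from the group's verdicts to TV Parental Guidelines ratings.
VERDICT_TO_TV_RATING = {
    "green":  "TV-G",
    "yellow": "TV-14",
    "red":    "TV-MA",
}

def mapped_rating(title: str) -> str:
    """Return the TV rating implied by the third-party verdict, if any."""
    verdict = THIRD_PARTY_RATINGS.get(title)
    return VERDICT_TO_TV_RATING.get(verdict, "unrated")

if __name__ == "__main__":
    for show in THIRD_PARTY_RATINGS:
        print(show, "->", mapped_rating(show))
```

The point of the sketch is simply that this sort of translation is tractable when there are a few dozen intermediaries and a shared scale; it is far harder when millions of online creators use no common vocabulary at all.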

Regulatory Semantics

This panel also featured an extensive discussion about the recent surge in regulatory activity at the FCC in terms of broadcast “indecency” fines and regulation. Much of this activity is now before the courts pending legal review, but the panelists and participants engaged in an extended discussion of just how much longer the courts are going to tolerate the unique regulatory treatment of broadcasting relative to all its media competitors. Several participants noted that other countries have been trying to deal with this by expanding content controls or concocting new regulatory distinctions and rationales. For example, is the video “linear” or “non-linear”? Was the content in question “pushed” on consumers or “pulled” down by them voluntarily? But everyone agreed that these semantic debates rapidly become messy, subjective affairs; the bright lines just aren’t there.

Making Metadata Matter

The second and third panels extended the discussion to include the unique difficulties faced by many online operators with the explosion of online activity from across the globe. There was a strong focus on the role of metadata and how increased content tagging by users themselves could go a long way toward empowering consumers to filter potentially objectionable material on their own.

But a major discussion then followed about the fundamentals of metadata. Why haven’t existing metadata solutions gained more traction? How much can metadata cover / describe? Will content creators resist self-labeling or be lazy and just avoid it? What happens when people strip out the metadata by making an analog copy and then re-posting that copy online? Will it work effectively in a borderless world, one with many cultural and linguistic differences? Will government regulate the underlying metadata formulation / definition process? And it’s one thing to deal with metadata tagging for commercial content and another to deal with all the user-generated content out there. (Several participants argued that “community-rated” content solutions may offer a new way to address organic, user-generated content. That is, let users rate each other’s content, much as Xanga does on its social networking site.) There was also a lot of discussion about the importance of Microsoft’s pending launch of its new Vista operating system since it will incorporate extensive parental control tools and metadata-reading capabilities.
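To make the labeling-plus-filtering idea concrete, here is a minimal Python sketch of how a parental-control filter might compare a page’s self-applied content labels against a household policy. The descriptor vocabulary, scoring scale, and policy format are invented for illustration; they are not ICRA’s actual label schema or any shipping product’s API.

```python
# A minimal sketch of metadata-driven filtering. The descriptors and the
# 0 (none) to 3 (explicit) scale are invented for illustration only.

from typing import Dict

# Hypothetical self-applied labels attached to a page.
page_labels: Dict[str, int] = {"nudity": 0, "violence": 2, "gambling": 0}

# A parent's policy: the maximum level tolerated for each descriptor.
household_policy: Dict[str, int] = {"nudity": 0, "violence": 1, "gambling": 0}

def should_block(labels: Dict[str, int], policy: Dict[str, int]) -> bool:
    """Block when any self-applied label exceeds the household threshold,
    or when a page carries no labels at all (the 'lazy labeler' problem)."""
    if not labels:
        return True  # unlabeled content defaults to blocked
    return any(level > policy.get(descriptor, 0)
               for descriptor, level in labels.items())

if __name__ == "__main__":
    print("Block labeled page?", should_block(page_labels, household_policy))   # True: violence 2 > 1
    print("Block unlabeled page?", should_block({}, household_policy))          # True
```

Even this toy version surfaces the policy questions raised at the summit: who defines the descriptors, what happens to unlabeled or mislabeled content, and whether a default-block rule pushes honest creators to label or lazy ones to simply skip it.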

The event closed with an open discussion that featured a brief conversation about the role of user and device authentication as a possible solution to some of these problems. Identity and reputation management is one way that security is handled in other online contexts (consider eBay’s and Amazon’s user-rating and reputation systems). But how do we verify a user’s identity or reputation for online speech issues? And should we? This is where I jumped in and asked whether we really want to force everyone (including kids) to identify themselves online. Is anonymity dead? And if we require user or device authentication for domestic citizens or websites, might that encourage some of them to flock offshore to the truly shady alleys of the Internet?

Dealing with the “Massive Coordination Problem”

Finally, Tim Wu ended the event with a few concluding remarks and noted that, in his opinion, new solutions will only be effective if we can overcome what he referred to as a “massive coordination problem.” To make new private tools and controls work in this new digital world, he argued, we will need more centralized standards and control at a very basic level. One way to do it, he suggested, was to get all those with “market power” (in the loose sense of the word) in various industries to coordinate their parental control and child protection efforts.

On this point, Tim and I are in close agreement. In fact, I suggested this approach in my recent paper proposing an “Online Code of Conduct.” I firmly believe (and I certainly recognize that this is an uncomfortable, controversial position) that the “let a thousand flowers bloom” approach to labels, filters, and parental controls is NOT likely to produce effective results that can head off government regulation well into the future. Instead, we need a focused, tightly coordinated empowerment and education effort that is led by some of the titans of the media, communications and Internet world. If a handful of the largest Digital Economy operators and intermediaries can agree on some simple, user-friendly tools and technologies, then I believe that other, smaller operators will follow their lead since the public is increasingly demanding more effective parental control and child protection systems.

And while we have won many important battles in the courts beating back the regulators, I don’t know how much longer we can hold them off with just a court-based strategy. We know, for example, that the Department of Justice is still working hard to undermine the case that Internet filters are a superior solution and to convince the courts to let it enforce the Child Online Protection Act (COPA). If the government somehow convinced the courts that filters were not an effective tool of private content control, then we could be on the verge of a major legislative / regulatory push for more government content regulation in the name of “protecting the children.” That’s why it is important for industry to coordinate and redouble its efforts now to head off this threat. If the courts see industry stepping up and doing more, it could help tip the balance in important cases currently pending or coming soon before them for consideration.

In conclusion, this was another outstanding ICRA event, and Stephen Balkam is once again to be congratulated for bringing it together and moderating it so effectively. He said that the next major ICRA roundtable will be held in December here in Washington, D.C. I’m very much looking forward to it and promise another write-up like this one after it is complete.
