Thoughts on the Future of Content Controls (Both Public & Private)

February 23, 2006

On Tuesday I participated in a very interesting roundtable discussion on the future of content regulation in a multi-media world. The event was held at Yahoo! headquarters in Sunnyvale, CA and it featured representatives from a wide variety of companies and private organizations including: Google, Microsoft, AT&T, Verizon, AOL, Yahoo!, TRUSTe, the Kaiser Family Foundation, and Children Now. Our discussion focused on how to craft workable, private parental controls for Digital Age media content.

The roundtable was hosted by Stephen Balkham, CEO of the Internet Content Rating Association (ICRA). ICRA is an organization that works to create a safer online environment for kids by devising workable screening solutions for parents. In particular, ICRA has been a pioneer in the field of Internet content labeling and filtering. The organization has developed a system of objective content descriptors that website operators and other online media providers can use to label their content. Some of the companies and organizations listed above, as well as many other Internet, media and telecom companies, have already signed agreements with ICRA to use its content labels. Most recently, AT&T and Verizon agreed to use the ICRA system to label their content offerings.

The Challenges of Controlling Content in a World of Media Abundance
I kicked off the roundtable with a “50,000-ft.” overview of the challenges that lie ahead. My remarks were drawn from the introduction to my new book on content controls in a world of media convergence. At the conference, as in my book, I put forward the thesis that content regulation, as we have traditionally understood it, is doomed. This is because a confluence of social, legal and, most importantly, technological developments is slowly undermining the ability of legislators and regulators, at all levels of government, to control the nature or quality of media programming. The demise of content controls may take many years–potentially even decades–to play out, but signs of the impending death of the old regulatory regime are already evident.


For example, today’s kids can already use devices such as the PlayStation Portable (PSP), Apple iPods and their cell phones to play games, watch movies, surf the Internet, e-mail or instant message friends, download music and videos, and even watch live TV. What innovative media technologies lie around the corner? Who knows, but I bet they’ll make the PSP, iPod and current cell phones look primitive in fairly short order!

These technologies are prime examples of the “convergence culture” that our kids already live in today. But media convergence is just one of the new marketplace realities that undermine traditional content controls. A second factor at work is the sheer scale, scope and volume of media already out there today. In our generation, we have witnessed the death of geography as a limiting factor in terms of human communications. All media platforms today are being built or reconfigured with the expectation of having a global audience or customer base. Thus, the scale and the scope of media are growing well beyond the reach of regulators.

But that’s not the biggest problem that lawmakers face. It’s the sheer volume of stuff out there that will undermine traditional content controls more than anything else. In a world of organically-generated content, in which every man, woman, and child can be a publisher or broadcaster from their own homes, it would take an army of censors working 24-7 to police all the content out there.

A final factor that is undermining traditional content controls is the courts. Specifically, I think we are witnessing the beginning of a new appreciation for the First Amendment (again, in the courts, not Congress). Although some technologies (the Internet and video games in particular) are winning more First Amendment freedoms today than others, the courts are sounding increasingly skeptical about traditional regulatory rationales for older media too. Slowly but surely, therefore, the law is being altered in the direction of greater protection of the First Amendment rights of ALL speakers. It’s just a matter of time before the right case comes along that allows the courts to strike at the heart of traditional “scarcity-based” rationales for content controls. (The Red Lion and Pacifica cases are what I’m talking about here.)

So, again, the combined effect of these factors will be, in my opinion, the gradual undermining of traditional efforts by governments to control media content. If my thesis is correct, then the next question a lot of people (especially parents) will ask is: OK, now what?! After all, it’s not just policymakers who will expect private industry and organizations to step up to the plate and take steps to protect children if government can’t; it’s also parents themselves. Indeed, as a parent of two children myself, I find myself facing the same challenge of coping with our wonderful new world of media abundance and convergence.

What sort of private rules, tools or systems can we put in place to help parents as traditional government content restrictions wither away? This was the focus of the ICRA roundtable discussion in California this week.

No Single Solution
The speakers and participants at the event didn’t have all the answers, of course, because there are no perfect silver bullet solutions. Indeed, that was one of three key points I took away from the day’s discussion. But while the participants in the summit agreed that there is no single, simple solution out there, they did outline many of the good systems and tools that are already at our disposal today. We talked about cable and satellite set-top box controls, the V-Chip, various ratings systems (for movies, music, video games and Net content), cell phone and video game blocking technologies, and so on.

But the problem we kept coming back to again and again was that many parents either are not aware of many of these excellent solutions or simply choose not to use them. Part of the problem, as one participant noted, is that if a tool is not incredibly simple, parents won’t bother even trying to use it. (Certainly our 10-year experience with the V-Chip reflects that fact. It’s not that hard to use the V-Chip, but few parents have ever even bothered to try.) Are there ways to make ratings and screening / filtering tools even easier for parents to access and use? That was something all participants agreed industry needed to work harder at. More media education was also a hot topic of discussion.

The Problem of Organically-Generated Content
A second major focus of our roundtable discussion involved the issue I raised about the challenge of dealing with a world of organically-generated content where everybody can be a publisher. In the old days, we used to worry mostly about what our kids could see or download, but the bigger problem today is what kids will upload about themselves or others. Not surprisingly, therefore, MySpace.com was a hot topic of discussion. With digital cameras, camcorders, and cell phone cameras, kids are putting all sorts of stuff on MySpace that they probably shouldn’t. Figuring out how to deal with that is a real challenge.

Who Will be the New “Gatekeepers”?
This led to a long discussion about who the new gatekeepers will be in a world of media abundance. In the old days, the major broadcasters and print outlets played a sort of self-regulatory “gatekeeper” role for media content. They decided when some forms of content “crossed the line” and they self-censored it before the government did. I’m not saying that those were “the good old days” by any stretch of the imagination. I’m just pointing out that many large media intermediaries did screen the bulk of popular media content before anything particularly offensive got out the door. Some call that corporate social responsibility; others think of it as unjust private censorship.

Regardless, how do such “gatekeeper” notions translate to our new multi-media universe? As one representative of a major Internet company noted, “it’s not feasible for us to review everything out there that people post on our systems.” No doubt that is true. But can or should they take steps to screen at least some content? Should MySpace.com, for example, pre-screen pictures uploaded by kids under a certain age? Such a move would not only be highly intrusive but also quite impractical in light of the millions of users already at work on the site. But what if parents contacted the site and requested that material from their child’s personal page be taken down? That’s a more interesting and feasible solution, but it also raises some questions about how that process would work.

Universal Ratings?
Finally, during our discussion about ratings and the various systems already out there today, there was an interesting debate about the possibility and desirability of a more “universal” ratings system. Is a single ratings scheme for ALL media possible? Several participants had their doubts, but many agreed that part of the frustration parents have today is dealing with multiple ratings schemes. For example, I like to use the example of the last Star Wars movie, “Revenge of the Sith.” It was rated one way in the cinema (PG-13) and another way for the popular video game (T for Teen), even though the two shared a great deal of the same content. Does that make any sense? Perhaps not, but then again, you could also argue that it’s not really a big deal. Parents can easily see how both the movie and the game are rated and probably quickly determine whether each is appropriate for their child, even if those ratings are different. But if parents were relying on automated screening tools to help them block content, and multiple ratings or labeling systems were at work, then it’s easy to see how frustrating a world of multiple ratings could become for parents.

Better Labels, Better Tools, and More Information / Education
As more and more media migrates to the Internet and mobile devices, however, almost everyone seemed to agree that more needed to be done to get websites and online content better labeled / rated so that parents have better information or can take steps to automatically block undesirable websites from their children’s media devices. That’s why the ICRA system attracted a great deal of support from most participants. It provides very objective content descriptors that website operators can use to label their sites.
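To make the idea concrete, here is a minimal sketch of how label-based screening works in principle: a site publishes machine-readable content descriptors, and a parental-control tool compares them against limits a parent has configured. The descriptor names and numeric levels below are invented for illustration; they are not ICRA's actual vocabulary or format.

```python
# Labels a site might declare about its own content (0 = none, 3 = strong).
# These category names and levels are hypothetical, not ICRA's real schema.
site_labels = {"violence": 2, "nudity": 0, "language": 1, "gambling": 0}

# Maximum levels a parent will allow for a given child.
parental_limits = {"violence": 1, "nudity": 0, "language": 2, "gambling": 0}

def is_blocked(labels, limits):
    """Block the site if any declared descriptor exceeds the parent's limit."""
    return any(labels.get(category, 0) > limit
               for category, limit in limits.items())

print(is_blocked(site_labels, parental_limits))  # violence 2 > 1, so True
```

The key design point is that the site only describes its content objectively; the judgment about what is acceptable lives entirely on the parent's side, which is why the same label set can serve families with very different standards.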

But there is one big problem: How do we get more online media providers to voluntarily adopt such content labels? Are there ways to incentivize people to do so? If we can get many of the biggest media, communications and Internet players to adopt such labels, that will go a long way toward getting this movement going. Indeed, this could be the “gatekeeper” role such companies and organizations play in our current media generation. But what about all that organically-generated content out there, like blogs, podcasts, and amateur porn sites? How do we get them to self-label? (“Mainstream” porn sites, by the way, have agreed to adopt the ICRA labeling system to allow parents to block their content more easily. So they’re not the problem. It’s the “homegrown porn” that people post freely that will continue to pose the most serious problem.)

Conclusion
In sum, this week’s excellent ICRA roundtable discussion reminded me that we face many serious challenges in terms of finding better ways to help parents shield their children from objectionable content, especially on the Internet. Parents will try to do as much as they can, of course, but many of them are drowning in a world of information overload and are going to expect more help from industry and other organizations. If workable, easy-to-use solutions are not forthcoming, they will turn to lawmakers with greater regularity to solve this problem. Even though I do not think government-based solutions are workable in our new environment, it would be nice if industry could take more steps to help parents so that they don’t feel the need to run to Uncle Sam to play this role for all of us.

  • http://www.cato.org/people/harper.html Jim Harper

    By my calculation, that was 1,975 words in 93 sentences, for an average of 21.2 words per sentence.

  • Amused

    Jim Harper focuses on the difficult problems of adapting law and policy to the unique problems of the information age. He also counts the number of words Adam uses in his blog posts. The law thing doesn’t take up enough of his time, obviously.
