Internet Governance & ICANN

Harvard’s Jonathan Zittrain has launched an interesting new project called “HerdictWeb,” which “seeks to gain insight into what users around the world are experiencing in terms of web accessibility; or in other words, determine the herdict.”  It’s a useful tool for determining whether governments are blocking certain websites for whatever reason.  Here’s Zittrain’s sock puppet video with all the details!

The website is quite slick and very user-friendly, and they’ve even created a downloadable Firefox button that will automatically check site accessibility while you’re surfing the Net.
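
For the technically curious, here's a rough idea of what a probe like that does under the hood. The sketch below (in Python) is purely illustrative; the site list, status labels, and overall design are my own assumptions, not Herdict's actual add-on or API:

```python
# A minimal sketch of a client-side reachability probe, in the spirit of
# what the Herdict add-on does. The site list, status labels, and overall
# design are hypothetical illustrations, not Herdict's actual code or API.
import urllib.request
import urllib.error

# Hypothetical list of sites to test from this user's vantage point.
SITES = ["http://example.com", "http://example.org"]

def check_site(url: str, timeout: float = 10.0) -> str:
    """Probe a URL and classify the result as seen from here."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return "accessible" if resp.status == 200 else f"http_{resp.status}"
    except urllib.error.HTTPError as e:
        return f"http_{e.code}"
    except OSError:
        # DNS failure, connection reset, and timeout all look alike from a
        # single client, which is why aggregating many users' reports
        # ("the herdict") is what makes the picture meaningful.
        return "unreachable"

if __name__ == "__main__":
    for url in SITES:
        print(f"{url}: {check_site(url)}")
```

A single probe can't tell censorship from an ordinary outage, of course; the whole point of the project is that aggregating reports from many users in many countries can.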

The information gathered from this effort will be useful for the OpenNet Initiative that Zittrain and John Palfrey co-created (with others from Univ. of Toronto, Oxford Univ., and Univ. of Cambridge) and wrote about in their excellent book, Access Denied: The Practice and Policy of Global Internet Filtering, which was one of my favorite technology policy books of the past year.  The data collected will give them, and us, a fuller picture of just how widespread global filtering and censorship efforts really are.  I encourage you to take a look and spread the word, especially to those in foreign countries who could probably use it more than we do. (Of course, their governments will likely block Herdict once the word gets around!)

ICANN has just released a second draft of its Applicant Guidebook, which would guide the creation of new generic top-level domains (gTLDs) such as .BLOG, .NYC or .BMW. As ICANN itself declared (PDF), “New gTLDs will bring about the biggest change in the Internet since its inception nearly 40 years ago.”  PFF Adjunct Fellow and former ICANN Board member Michael Palage addressed the key problems with ICANN’s original proposal in his paper ICANN’s “Go/No-Go” Decision Concerning New gTLDs (PDF & embedded below), released earlier this week.

ICANN deserves credit for its detailed analysis of the many comments on the original draft, which Mike summarized back in December.  ICANN also deserves credit for addressing two strong concerns of the global Internet community in response to the first draft:

  • ICANN has removed its proposed 5% global domain name tax on all registry services, something Mike explains in greater detail in his “Go/No-Go” paper.
  • ICANN has commissioned a badly-needed economic study on the dynamics of the domain name system “in broad.” But such a study must address how the fees ICANN collects from specific user communities relate to the actual costs of the services ICANN provides. The study should also consider why gTLDs should continue to provide such a disproportionate percentage of ICANN’s funding—currently 90%—given increasing competition between gTLDs and ccTLDs (e.g., the increasing use of .CN in China instead of .COM).

These concerns are part of a broader debate:  Will ICANN abide by its mandate to justify its fees based on recovering the costs of services associated with those fees, or will ICANN be free to continue “leveraging its monopoly over an essential facility of the Internet (i.e., recommending additions to the Internet’s Root A Server) to charge whatever fees it wants?”  If, as Mike has discussed, ICANN walks away from its existing contractual relationship with the Department of Commerce and claims “fee simple absolute” ownership of the domain name system, who will enforce such a cost-recovery mandate?  

But ICANN simply “kicked the can down the road on the biggest concern”: how to minimize abusive domain name registrations (e.g., cybersquatting, typosquatting, phishing, etc.) and reduce their impact on consumers. Continue reading →

What would it take to create a more secure Internet?  That’s what John Markoff explores in his latest New York Times article, “Do We Need a New Internet?”  Echoing some of the same fears Jonathan Zittrain articulates in his new book The Future of the Internet, Markoff wonders if online viruses and other forms of malware have gotten so out-of-control that extreme measures may be necessary to save the Net.  Markoff argues that, compared to when cyber-security attacks first started growing more than 20 years ago:

[T]hings have gotten much, much worse. Bad enough that there is a growing belief among engineers and security experts that Internet security and privacy have become so maddeningly elusive that the only way to fix the problem is to start over.

Like many others, Markoff fingers anonymity as one potential culprit:

The Internet’s current design virtually guarantees anonymity to its users. (As a New Yorker cartoon noted some years ago, “On the Internet, nobody knows that you’re a dog.”) But that anonymity is now the most vexing challenge for law enforcement. An Internet attacker can route a connection through many countries to hide his location, which may be from an account in an Internet cafe purchased with a stolen credit card. “As soon as you start dealing with the public Internet, the whole notion of trust becomes a quagmire,” said Stefan Savage, an expert on computer security at the University of California, San Diego.

Consequently, Markoff suggests that:

A more secure network is one that would almost certainly offer less anonymity and privacy. That is likely to be the great tradeoff for the designers of the next Internet. One idea, for example, would be to require the equivalent of drivers’ licenses to permit someone to connect to a public computer network. But that runs against the deeply held libertarian ethos of the Internet.

Indeed, not only does it run counter to the ethos of the Net, but as Markoff rightly notes, “Proving identity is likely to remain remarkably difficult in a world where it is trivial to take over someone’s computer from half a world away and operate it as your own. As long as that remains true, building a completely trustable system will remain virtually impossible.”  I’ve spent a lot of time writing about that fact here and won’t belabor the point other than to say that efforts to eliminate anonymity for the entire Internet would prove extraordinarily intrusive and destructive — of both the Internet’s current architecture and the rights of its users.  There’s just something about a “show-us-your-papers,” national ID card-esque system of online identification that creeps most of us out. That’s why I spend so much time fighting age verification mandates for social networking sites and other websites; it’s the first step down a very dangerous road.

But what if we could apply such solutions in a narrower sense?  That is, could we create more secure communities within the overarching Internet superstructure that might provide greater security?  Markoff starts thinking along those lines when he suggests…
Continue reading →

The next several days feature a variety of events, both on broadband stimulus legislation and on some of the broader issues associated with the Internet and its architecture.

On Friday, January 30, the Technology Policy Institute features a debate, “Broadband, Economic Growth, and the Financial Crisis: Informing the Stimulus Package,”  from 12 noon – 2 p.m., at the Rayburn House Office Building, Room B369.

Moderated by my friend Scott Wallsten, senior fellow and vice president for research at the Technology Policy Institute, the event features James Assey, Executive Vice President for the National Cable & Telecommunications Association; Robert Crandall, Senior Fellow in Economic Studies, The Brookings Institution; Chris King, Principal/Senior Telecom Services Analyst, Stifel Nicolaus Telecom Equity Research; and Shane Greenstein, Elinor and Wendell Hobbs Professor of Management and Strategy at the Kellogg School of Management, Northwestern University.

The language promoting the event notes, “How best to include broadband in an economic stimulus package depends, in part, on understanding two critical issues: how broadband affects economic growth, and how the credit crisis has affected broadband investment.  In particular, one might favor aggressive government intervention if broadband stimulates growth and investment is now lagging.  Alternatively, money might be better spent elsewhere if the effects on growth are smaller than commonly believed or private investment is continuing despite the crisis.”

And then, on Tuesday, MIT Professor David Clark, one of the pioneers of the Internet and a distinguished scientist whose work on “end-to-end” connectivity is widely cited as the architectural blueprint of the Internet, looks to the future.  Focusing on the dynamics of advanced communications – the role of social networking, the problems of security and broadband access, and the industrial implications of network virtualization and overlays – Clark will tackle the new forces shifting regulation and market structure.

David Clark is Senior Research Scientist at the MIT Computer Science and Artificial Intelligence Laboratory. At the forefront of Internet development since the early 1970s, Dr. Clark served as Chief Protocol Architect from 1981 to 1989 and then chaired the Internet Activities Board. A past chairman of the Computer Science and Telecommunications Board of the National Academies, Dr. Clark is co-director of the MIT Communications Futures Program.

I’m no longer affiliated with the Information Economy Project at George Mason University, but I urge all interested in the architecture of the Internet to register and attend. More information about the lecture, and about the Information Economy Project, is available at http://iep.gmu.edu/davidclark.

It will take place at the George Mason University School of Law, Room 120, 3301 Fairfax Drive, Arlington, VA 22201 (Orange Line: Virginia Square-GMU Metro), on Tuesday, February 3, from 4 – 5:30 p.m., with a reception to follow. The event is free and open to the public, but reservations are requested. To reserve a spot, please e-mail iep.gmu@gmail.com.

I used to have a (semi-crazy) uncle who typically began conversations with lame jokes or bad riddles. This sounds like one he might have used had he lived long enough: What do Thomas Jefferson, a moose, and cyberspace have in common?

The answer to that question can be found in a new book, In Search of Jefferson’s Moose: Notes on the State of Cyberspace, by David G. Post, a Professor of Law at Temple University who teaches IP and cyberspace law. Post is widely regarded as one of the intellectual fathers of the “Internet exceptionalist” school of thinking about cyberlaw.  Basically, Post sees this place we call “cyberspace” as something truly new, unique, and potentially worthy of some special consideration, or even somewhat different ground rules than we apply in meatspace. More on that in a bit.

[Full disclosure: Post’s work was quite influential on my own thinking during the late 1990s, so much so that when I joined the Cato Institute in 2000, one of the first things I did was invite David to become an adjunct scholar with Cato. He graciously accepted and remains a Cato adjunct scholar today. Incidentally, Cato is hosting a book forum for him on February 4th that I encourage you to attend or watch online. Anyway, it’s always difficult to be perfectly objective when you know and admire someone, but I will try to do so here.]

Post’s book is essentially an extended love letter — to both cyberspace and Jefferson. Problem is, as Post even admits at the end, it’s tough to know which subject this book is supposed to teach us more about. The book loses focus at times — especially in the first 100 pages — as Post meanders between historical tidbits of Jefferson’s life and thinking and what it all means for cyberspace. But the early focus is on TJ.  Thus, those who pick up the book expecting to be immediately immersed in cyber-policy discussions may be a bit disappointed at first.  As a fellow Jefferson fanatic, however, I found all this history terrifically entertaining, whether it was the story of Jefferson’s plow and his other agricultural inventions and insights, TJ’s unique interest in science (including cryptography), or that big moose of his.

OK, so what’s the deal with the moose? When TJ was serving as a minister to France in the late 1780s, at considerable expense to himself, he had the complete skeleton, skin, and horns of a massive American moose shipped to the lobby of his Paris hotel. Basically, Jefferson wanted to make a bold statement to his French hosts about this New World he came from and wake them up to the fact that some very exciting things were happening over there that they should be paying attention to. That’s one hell of a way to make a statement!

Continue reading →

I’ve been working closely with PFF’s new Adjunct Fellow Michael Palage on ICANN issues.  Here is his latest note, from the PFF blog.

ICANN recently proclaimed that the “Joint Project Agreement” (one of two contractual arrangements that ICANN has with the U.S. Department of Commerce (DoC) governing ICANN’s operations) will come to an end in September 2009. ICANN’s insistence on this point first became clear back in October 2008 at ICANN’s Washington, D.C. public forum on Improving Institutional Confidence when Peter Dengate Thrush, Chair of ICANN’s Board declared:

the Joint Project Agreement will conclude in September 2009. This is a legal fact, the date of expiry of the agreement. It’s not that anyone’s declared it or cancelled it; it was set up to expire in September 2009.

ICANN’s recently published 2008 Annual Report stuck to this theme:

“As we approach the conclusion of the Joint Project Agreement between the United States Department of Commerce and ICANN in September 2009…” – His Excellency Dr. Tarek Kamel, Minister of Communications and Information Technology, Arab Republic of Egypt

“Concluding the JPA in September 2009 is the next logical step in transition of the DNS to private sector management.” – ICANN Staff

“This consultation’s aim was for the community to discuss possible changes to ICANN in the lead-up to the completion of the JPA in September 2009.” – ICANN Staff

ICANN’s effort to make the termination of the JPA seem inevitable is concerning on two fronts. First, ICANN fails to mention that the current JPA appears to be merely an extension/revision of the original 1998 Memorandum of Understanding (MoU) with the DoC, which was set to expire in September 2000. The JPA thus does not appear to be a free-standing agreement, but merely a continuation of the MoU, as Bret Fausset argues in his excellent analysis of the relationship between the MoU and the JPA (also discussed by Milton Mueller). It would therefore be more correct to ask whether the “MoU/JPA” (meaning the entire agreement as modified by the most recent JPA) will expire or be extended. Continue reading →

Mike Palage, the first Adjunct Fellow at PFF’s Center for Internet Freedom, just published the following piece on the PFF blog.

ICANN‘s plan to begin accepting applications for new generic top-level domains (gTLDs) in mid-2009 may have been derailed by last week’s outpouring of opposition from the global business community and the United States Government (USG). Having been involved with ICANN for over a decade and having served on its Board for three years, I’ve never seen such strong and broad opposition to one of ICANN’s proposals.

This past June, the ICANN Board directed its staff to draft implementation guidelines based upon the policy recommendations of the Generic Names Supporting Organization (GNSO) that ICANN should allow more gTLDs such as .cars to supplement existing gTLDs such as .com. In late October, the ICANN staff released a draft Applicant Guidebook detailing its proposal. The initial public forum on this proposal closed on December 15, with over 200 comments filed online.

In its December 18 comments, the USG questioned whether ICANN had adequately addressed the “threshold question of whether the consumer benefits outweigh the potential costs.” This stinging rebuke from the Commerce Department merely confirms the consensus among the 200+ commenters on ICANN’s proposal: ICANN needs to do more than merely rethink its aggressive timeline for implementing its gTLD proposal or tweak the mechanics of the proposal around the edges. Instead, ICANN needs to go back to the drawing board and propose a process that results in a responsible expansion of the name space, not merely a duplication of it.

Continue reading →

The intrepid Chris Soghoian has turned up an important wrinkle in Google’s services. Google pulled his AdWords ad pointing out AT&T’s campaign contributions to an Indiana politician after AT&T lodged a trademark complaint about it.

Trademark law is for preventing confusion about the source of goods and services. There is no possibility that Chris’ ad would confuse consumers in this way. He’s not providing telecommunications services, and his ad didn’t suggest it. Chris’ use of “AT&T” did not violate AT&T’s trademarks.

The subject matter of Chris’ ad is an important part of our national discourse, and something people should be able to run ads about on a platform like Google. It would be, well, evil, to kick small public policy advocates to the curb in favor of big corporations.

A company like Google is in a tough spot, of course, trying to adjudicate trademark claims at scale. But it is not acceptable to treat trademark complaints as proven just for having been submitted.

Google should take some steps to make its process more fair, such as by allowing advertisers to respond to a trademark complaint before Google acts on it. Much of the process could be automated, and it could explain to both sides what trademark rights include – and what they don’t. If after a few automated steps, the two remained at loggerheads, Google employees could take a look to see whether the claim or the response were meritorious. (A trained monkey could have determined that Chris’ ad is not a trademark violation.)
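
To make that concrete, here's a rough sketch (in Python) of what such an automated triage might look like. The states, fields, and single source-confusion test are hypothetical simplifications of the process described above, not Google's actual system, and certainly not full trademark doctrine:

```python
# A rough sketch of the automated triage described above. All states,
# field names, and tests are hypothetical simplifications; real trademark
# analysis involves far more than a single source-confusion check.
from dataclasses import dataclass
from enum import Enum, auto
from typing import Optional

class Outcome(Enum):
    DISMISSED = auto()      # complaint fails an obvious threshold test
    AD_REMOVED = auto()     # advertiser concedes or never responds
    HUMAN_REVIEW = auto()   # automation exhausted; an employee decides

@dataclass
class Complaint:
    mark: str                              # the claimed trademark, e.g. "AT&T"
    ad_text: str
    advertiser_offers_similar_goods: bool  # heuristic / self-reported
    advertiser_response: Optional[str] = None

def triage(c: Complaint) -> Outcome:
    # Step 1: a complaint is not treated as proven just for being submitted.
    # Notify the advertiser and wait for a response before acting.
    if c.advertiser_response is None:
        return Outcome.AD_REMOVED  # advertiser declined to defend the ad

    # Step 2: explain to both sides what trademark rights include, then
    # apply the cheapest objective test: trademark protects against
    # confusion about the *source* of goods and services, so an ad that
    # offers no competing goods or services can't cause that confusion.
    if not c.advertiser_offers_similar_goods:
        return Outcome.DISMISSED

    # Step 3: if the parties remain at loggerheads after the automated
    # exchange, a human looks at the merits of both claim and response.
    return Outcome.HUMAN_REVIEW
```

On that logic, Chris’ case never needs a human: a public policy ad offers no competing telecommunications services, so there is no source confusion for trademark law to police.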

In close cases, Google should leave it to the parties to resolve, while it works in the courts to generate a substantive body of law that service providers in the position of Google are not properly liable for the trademark infringements of users. (My brief pitch for common law findings of “no liability” in such situations – as opposed to statutory protections like CDA section 230 – starts at minute 22 of this video.)

Would these ideas increase Google’s cost and potential liability? Yes, some. But Google should embrace those costs as it educates its users, employees, courts, and – most important – trademark holders about what trademark does and does not do.

Kudos to Chris for his tenacity. Google, fix this.

The third meeting of the United Nations’ Internet Governance Forum (IGF) took place this week in Hyderabad, India. One concerning takeaway is the increased posturing by governments to assert greater control over the Internet.

For the uninitiated, the IGF is an outgrowth of the World Summit on the Information Society (WSIS), and is meant to be a multi-stakeholder “talk shop” on public policy issues related to the development and governance of the Internet. It’s the forum for governments and social policy agendas, whereas ICANN is meant to be a technical body for coordinating the Internet’s naming system.

The U.S. had advocated for a minimal role for the United Nations and the IGF, while many governments want to assert more control than they possess at ICANN. A compromise was struck at the final WSIS meeting in Tunis – “Enhanced Cooperation” – in order to defer choosing between existing and new mechanisms.

As my colleague Steve DelBianco describes it, it’s sort of like the way he handled his teenage son when he nagged him about getting a new car to drive:  work on ‘Enhanced Transportation’ instead.

Steve and NetChoice work to avoid a new mechanism for Internet Governance that’s designed by, and for, governments, preferring instead Enhanced Cooperation within existing mechanisms.

Yet there’s danger on the horizon. My colleague Mark Blafkin reports in this blog post that at the Hyderabad meeting, politicians were spouting populist rhetoric about returning control of the Internet to “the people.”

Everton Lucero, the Brazilian representative to ICANN’s Governmental Advisory Committee (GAC), delivered a beautiful speech filled with inspiring rhetoric about returning Internet Governance to the concept of “We the People” and taking the power out of the hands of the “nobles and landlords.”  Unfortunately, that is all it was: a beautiful speech that ignored reality in an attempt to grab the power to control the Internet and censor content. Brazil’s government has shown an increasing distaste for freedom of speech, especially on the Internet.  It recently forced local television to pull a documentary exposing some of the government’s most egregious efforts at political censorship of the press.

Continue reading →

Over the past year, I have been monitoring a very interesting trend with important ramifications for the future of Internet policy. State Attorneys General (AGs) — often in league with the National Center for Missing and Exploited Children (NCMEC) — have been striking a variety of “voluntary” agreements with various Internet companies that deal with child safety concerns or other online issues. These agreements require the companies involved to take various steps to alter site architecture and functionality, commit to stop certain practices, or block certain users (e.g., predators, escort services) or types of content (e.g., child pornography, online “discrimination”) altogether.

To begin, let me be very clear about one thing: Some of these activities or types of content warrant a law enforcement response. That is certainly the case with child pornography or predation, for example. However, as I will note below, there is a legitimate question about whether state officials and a non-profit private organization should be crafting legal or regulatory policies to address such concerns for a global medium like the Internet. Regardless, these agreements are creating a new layer of Internet regulation (almost extra-legal in character) that is worthy of exploration.

First, let me itemize some of these recent “voluntary” agreements between Internet companies and the AGs and/or NCMEC:

Continue reading →