Articles by Milton Mueller

Milton L. Mueller is Professor at Syracuse University's School of Information Studies and XS4All Professor at Delft University of Technology, the Netherlands. He is the author of Networks and States (MIT Press, 2010) and other books. Visit his website.


In her UN General Assembly speech denouncing NSA surveillance, Brazil’s President Dilma Rousseff said:

Information and communications technologies cannot be the new battlefield between States. Time is ripe to create the conditions to prevent cyberspace from being used as a weapon of war, through espionage, sabotage, and attacks against systems and infrastructure of other countries. … For this reason, Brazil will present proposals for the establishment of a civilian multilateral framework for the governance and use of the Internet and to ensure the protection of data that travels through the web.

We share her outrage at mass surveillance. We share her opposition to the militarization of the Internet. We share her concern for privacy.

But when President Rousseff proposes to solve these problems by means of a “multilateral framework for the governance and use of the Internet,” she reveals a fundamental flaw in her thinking. It is a flaw shared by many in civil society.

You cannot control militaries, espionage and arms races by “governing the Internet.” Cyberspace is one of many aspects of military competition. Unless one eliminates or dramatically diminishes political and military competition among sovereign states, states will continue to spy, break into things, and engage in conflict when it suits their interests. Cyber conflict is no exception.

Rousseff is mixing apples and oranges. If you want to control militaries and espionage, then regulate arms, militaries and espionage – not “the Internet.”

This confusion is potentially dangerous. If the NSA outrages feed into a call for global Internet governance, and this governance focuses on critical Internet resources and the production and use of Internet-enabled services by civil society and the private sector, as it inevitably will, we are certain to get lots of governance of the Internet, and very little governance of espionage, militaries, and cyber arms.

In other words, Dilma’s “civilian multilateral framework for the governance and use of the Internet” is only going to regulate us – the civilian users and private sector producers of Internet products and services. It will not control the NSA, the Chinese People’s Liberation Army, the Russian FSB or the British GCHQ.

Realism in international relations theory is based on the view that the international system is anarchic. This does not mean that it is chaotic, but simply that the system is composed of independent states and there is no central authority capable of coercing all of them into following rules. The other key tenet of realism is that the primary goal of states in the international system is their own survival.

It follows that the only way one state can compel another state to do anything is through some form of coercion, such as war, a credible threat of war, or economic sanctions. And the only time states agree to cooperate to set and enforce rules is when it is in their self-interest to do so. Thus, when sovereign states come together to agree to regulate things internationally, their priorities will always be to:

  • Preserve or enlarge their own power relative to other states; and
  • Ensure that the regulations are designed to bring under control those aspects of civil society and business that might undermine or threaten their power.

Any other benefits, such as privacy for users or freedom of expression, will be secondary concerns. That’s just the way it is in international relations. Asking states to prevent cyberspace from being used as a weapon of war is like asking foxes to guard henhouses.

That’s one reason why it is so essential that these conferences be fully open to non-state actors, and that they not be organized around national representation.

Let’s think twice about linking the NSA reaction too strongly to Internet governance. There is some linkage, of course. The NSA revelations should remind us to be realists in our approach to Internet governance. This means recognizing that all states will approach Internet regulation with their own survival and power uppermost on their agenda; it also means that no single state can be trusted as a neutral steward of the global Internet, because it will inevitably use its position to benefit itself. These implications of the Snowden revelations need to be recognized. But let us not confuse NSA regulation with Internet regulation.

Remember all the businesses, internet techies and NGOs who were screaming about an “ITU takeover of the Internet” a year ago? Where are they now? Because this time, we actually need them.

May 14 – 21 is Internet governance week in Geneva. We have declared it so because there will be three events that week for the global community concerned with global Internet governance. From 14 to 16 May the International Telecommunication Union (ITU) holds its World Telecommunication Policy Forum (WTPF). This year it is devoted to Internet policy issues. With the polarizing results of the Dubai World Conference on International Telecommunications (WCIT) still reverberating, the meeting will revisit debates about the role of states in Internet governance. Next, on May 17 and 18, the Graduate Institute of International and Development Studies and the Global Internet Governance Academic Network (GigaNet) will hold an international workshop on The Global Governance of the Internet: Intergovernmentalism, Multi-stakeholderism and Networks. Here, academics and practitioners will engage in what should be a more intellectually substantive debate on modes and principles of global Internet governance.

Last but not least, the UN Internet Governance Forum will hold its semi-annual consultations to prepare the program and agenda for its next meeting in Bali, Indonesia. The IGF consultations are relevant because, to put it bluntly, it is the failure of the IGF to bring governments, the private sector and civil society together in a commonly agreed platform for policy development that is partly responsible for the continued tension between multistakeholder and intergovernmental institutions. Whether the IGF can get its act together and become more relevant is one of the key issues going forward.

Continue reading →

On Wednesday, April 10, a bill “to Affirm the Policy of the United States Regarding Internet Governance” was marked up in the U.S. House of Representatives. The bill is an attempt to put a formal policy statement into statute law. The operative part says simply:

It is the policy of the United States to promote a global Internet free from government control and to preserve and advance the successful multistakeholder model that governs the Internet.

Yet this attempt to formulate a clear principle and make it legally binding policy has become controversial. This has happened because the bill brings to a head the latent contradictions and elisions that characterize U.S. international Internet policy. In the process it has driven a wedge into what was once a unified front by U.S. Democrats and Republicans against incursions into Internet governance by intergovernmental organizations such as the ITU.

The problem, it seems, is that the Democratic side of the aisle can’t bring itself to say that it is against ‘government control’ per se. Indeed, the bill has forced people linked to the Obama administration to come out and openly admit that ‘government control’ of the internet is OK when we exercise it; it’s just those other countries and international organizations that we need to worry about.

Continue reading →

ARIN is the Internet numbers registry for the North American region. It likes to present itself as a paragon of multistakeholder governance and a staunch opponent of the International Telecommunication Union’s encroachments into Internet governance. Surely, if anyone wants to keep the ITU out of Internet addressing and routing policy, it would be ARIN. And conversely, in past years the ITU has sought to carve away some of the authority over IP addressing from ARIN and the other Regional Internet Registries (RIRs).

But wait, what is this? On March 15 the ITU Secretary-General released a preparatory report for the ITU’s World Telecommunication Policy Forum, which will take place in Geneva May 14-16. The report contains six Internet-related policy resolutions “to provide a basis for discussion … focusing on key issues on which it would be desirable to reach conclusions.” Draft Opinion #3 pertains to Internet addressing. Among other things, the draft resolves:

  • “that needs-based address allocation should continue to underpin IP address allocation, irrespective of whether they are IPv6 or IPv4, and in the case of IPv4, irrespective of whether they are legacy or allocated address space;”
  • “that all IPv4 transactions be reported to the relevant RIRs, including transactions of legacy addresses that are not necessarily subject to the policies of the RIRs regarding transfers, as supported by the policies developed by the RIR communities;”
  • “that policies of inter-RIR transfer across all RIRs should ensure that such transfers are needs based and be common to all RIRs irrespective of the address space concerned.”

These policy positions thrust the ITU and its intergovernmental machinery directly into the realm of IP addressing policy. But that is quite predictable; the ITU has always wanted to do that. What is unusual about these resolutions is that they bear an uncanny resemblance to the policy positions currently advocated by ARIN and the U.S. Department of Commerce.

Continue reading →

A market has developed in which specialized firms discover new vulnerabilities in software and sell that knowledge for tens or hundreds of thousands of dollars. These vulnerabilities are known as “zero-day exploits” because the software vendor and the public have had zero days’ notice of them before they are used. In this blog post, we recognize that this market may require some kind of action, but reject simplistic calls for “regulation” of suppliers. We recommend focusing on the demand side of the market.

Although there is surprisingly little hard evidence of its scope and scale, the market for vulnerabilities is considered troublesome or dangerous by many. While the bounties paid may stimulate additional research into security, it is the exclusive and secret possession of this knowledge by a single buyer that raises concerns. It is clear that when someone other than the software vendor pays $100,000 for a zero-day, they are probably not paying for defense, but rather for an opportunity to take advantage of someone else’s vulnerability. Thus, the vulnerabilities remain unpatched. (Secrecy also makes the market rather inefficient; it may be possible to sell the same “secret” to several buyers.)

The supply side of the market consists of small firms and individuals with specialized knowledge. They compete to be the first to identify new vulnerabilities in software or information systems and then bring them to buyers. Many buyers are reputed to be government intelligence, law enforcement or military agencies using tax dollars to finance purchases. But we know less about the demand side than we should. The point, however, is that buyers are empowered to initiate an attack, a power that even legitimate organizations could easily abuse.

Insofar as the market for exploits shifts incentives away from publicizing and fixing vulnerabilities toward competitive efforts to gain private, exclusive knowledge of them so they can be held in reserve for possible use, the market has important implications for global security. It puts a premium on dangerous vulnerabilities, and thus may put the social and economic benefits of the Internet at risk. While the US might think it has an advantage in this competition, as a leader in the Internet economy and one of the most cyber-dependent countries, it also has the most to lose.

Unfortunately, so far the only policy response proposed has been vague calls for “regulation.” Chris Soghoian in particular has made “regulation” the basis of his response, calling suppliers “modern-day merchants of death” and claiming that “Security researchers should not be selling zero-days to middle man firms…These firms are cowboys and if we do nothing to stop them, they will drag the entire security industry into a world of pain.”

Such responses, however, are too long on moral outrage and too short on hard-headed analysis and practical proposals. The idea that “regulation” can solve the problem overlooks major constraints:

Continue reading →

There are hundreds of applications for generic words in ICANN’s new top level domain program. They include .BOOK, .MUSIC, .CLOUD, .ACCOUNTANT, .ARAB and .ART. Some of the applicants for these domains have chosen to make direct use of the name space under the TLD for their own sites rather than offering them for broad general use. Amazon, for example, would probably make .BOOK an extension of its online bookstore rather than part of a large-scale domain name registration business; Google would probably make .CLOUD an extension of its own cloud computing enterprises.

This is really no different from Barnes and Noble registering BOOK.COM and using it only for its bookstore, Scripps registering FOOD.COM and controlling the content of the site, or CNET registering NEWS.COM and making exclusive use of the site for its own news and advertising. Nor is it terribly different from the .MUSEUM top level domain.

Yet these proposals have generated a loud chorus of objections from competing businesses. They have dubbed these applications ‘closed generics’ and shouted so loudly that ICANN is once again considering changing its policies in mid-implementation. ICANN staff has called for public comment and asked specifically whether it should change its rules to determine what is a ‘generic term’ and whether ICANN should enlarge even further its role as a top-down regulator and dictate whether certain business models can be associated with certain domain names.

A group of Noncommercial Stakeholders Group (NCSG) members has weighed in with some badly needed disinterested public comment. It isn’t about ‘open’ or ‘closed,’ they maintain; it is about the freedom to innovate.

As NCSG stakeholders, our position is driven neither by paying clients nor by an interest in the success of specific applications. It is based on a principled commitment to the ‘permissionless innovation’ that has made the Internet a source of creativity and growth. Our aim is to maximize the options available to DNS users and to minimize top-down controls. We support the freedom of individuals and organizations to register domains and use them legally in any way they see fit. We support experimentation with new ideas about what a TLD can do. We see no reason to impose ex ante restrictions on specific business models or methods of managing the name space under a TLD.

The group warns ICANN of the danger of giving itself the power to decide what qualifies as a ‘generic word’ and rejects any attempt to retroactively create new policies that would dictate business models for TLD applicants. Hopefully ICANN’s board will be able to look past the self-interested cries of businesses that want to eliminate competitors and consider the public interest in Internet freedom. The comments and list of supporters are available at this link.

In another blog post, I put the International Telecommunication Union’s WCIT into perspective. I ended that discussion with a question that no one else seems to be asking: should there be International Telecommunication Regulations (ITRs) at all? Why do we need them?

I don’t think we do need sector-specific international regulations. I think they can cause more trouble than benefit. To briefly explain why, I noted that every country has its own national regulations regarding interconnection, privacy, antitrust, consumer protection, and so on. Compatibility across platforms and services is much easier technically than it was in the 1930s and before, and tends to get worked out in the market through a variety of bridging technologies and nongovernmental standards forums. International telecommunications is a form of trade in services, and the WTO agreements already provide a sufficient regulatory basis for foreign or multinational providers to enter national markets and offer transnational services. Though not all countries are members of WTO, membership can be expanded and bilateral or regional agreements can supplement it.

Imagine my surprise when someone informed me that the Europeans were calling for the abrogation of the ITRs for exactly those reasons. Apparently they had defended that position for years. But the European drive to get rid of the ITRs was opposed and eventually blocked by — wait for it — the United States of America! The US, I am told, argued that the existing treaty was essential because most of the world’s international communications were regulated by it.

That puts a dramatically new spin on the US’s current campaign to fend off an ITU “takeover” of the Internet. If revision of the ITRs is such a threat to the Internet, why did the US insist on retaining them? If the ITRs are retained, it is inevitable that they will have to be updated and revised. And yet now, the US government is warning us that the revision process poses a major threat to the independence and freedom of the Internet. Something is wrong with this picture.

Most of my information about this is second-hand, from sources that want to remain off the record. But there is proof that the US has defended the importance of the ITRs in an ITU list of documents that can be viewed here. There, in a document repository of an ITU expert group that was preparing the ground for the WCIT, one finds a document submitted by the US entitled the “Continued Critical Role of the ITRs.” Now if you click on the link that I have mischievously placed to that document, you will be taken to a closed, login-required page; before you can read that document, you have to be a TIES member. In other words, this is yet another example of the closed nature of the ITU process. There is another set of papers here that would be of interest in understanding why we even have the ITRs. But they, too, are locked inside TIES.

And that means this is a job for WCITleaks! The U.S. government should release this document, and if it doesn’t, inside whistleblowers and other people with access to a TIES account need to leak it to us.

ICANN’s plan to open up the domain name space to new top level domains is scheduled to begin January 12, 2012. This long overdue implementation is the result of an open process that began in 2006. It would, in fact, be more realistic to say that the decision has been in the works for 15 years; i.e., since early 1997. That is when demand for new top-level domain names, and the need for other policy decisions regarding the coordination of the domain name system, made it clear that a new institutional framework had to be created. ICANN was the progressive and innovative U.S. response to that need. It was created to become a nongovernmental, independent, truly global and representative policy development authority.

The result has been far from perfect, but human institutions never are. Over the past 15 years, every stakeholder with a serious interest in the issue of top level domains has had multiple opportunities to make their voice heard and to shape the policy. The resulting new gTLD policy reflects that diversity and complexity. From our point of view, it is too regulatory and too costly, and it makes too many concessions to content regulators and trademark holders. But it will only get worse with delay. The compromise that came out of the process paves the way for movement forward after a long period of artificial scarcity, opening up new business opportunities.

Now there is a cynical, illegitimate last-second push by a few corporate interests in the United States to derail that process. Continue reading →

Paul Vixie, a renowned Internet pioneer who runs the Internet Systems Consortium, has written an article in ACM Queue attacking “those who would unilaterally supplant or redraw the existing Internet resource governance or allocation systems.” The publication of this article is a sign of a growing, important debate around the reform of IP address registries in the age of IPv4 exhaustion.

Vixie defends the Regional Internet Registries’ monopoly on IP address registration services and their current needs-based policies toward address transfers. I am sure that Paul sincerely believes in the arguments he makes, but it’s also true that Vixie is the chairman of the Board of the American Registry for Internet Numbers (ARIN), the regional address registry for North America. When Vixie argues that ARIN’s exclusive control over Whois and address transfer services is beneficial and “in the physics,” he is also defending the authority and revenue model of his own organization against a perceived threat.

And that takes us to another relevant fact. The argument Vixie makes is cast in generalities, but he is really attacking a specific firm, a holding company known as Denuo. Denuo has formed both a secondary marketplace called Addrex for the legitimate trading of IPv4 number blocks and an IP Address Registrar company known as Depository. Let’s set aside Depository for the moment (I will come back to it) and concentrate on Addrex, which has become the first end-to-end platform for legacy address holders to sell their IPv4 number blocks. Famously, Addrex scored a major success as the intermediary for the Nortel-Microsoft trade. But Nortel-Microsoft was unusually visible because it had to go through bankruptcy court. Is anything else happening? I spoke with Addrex’s President Charles Lee to find out. “We are very busy signing up a growing number of global corporate and governmental customers to sell their unused assets,” he said. I asked him what the buyer side of the marketplace was beginning to look like, and he said, “Our value proposition to large Asian network operators has resonated quite effectively and we expect to enter into many agreements with them over the coming months.” Surely Vixie and the ARIN Board have gotten wind of this. So when Vixie begins a public attack on this company and its business model, he is signaling to the rest of us that ARIN is worried. Continue reading →

“Global Internet Governance: Research and Public Policy Challenges for the Next Decade” is the title of a conference to be held May 5 and 6 at the American University School of International Service in Washington. See the full program here.

Featured will be a keynote by the NTIA head, Assistant Secretary of Commerce Lawrence Strickling. TLF-ers may be especially interested in the panel on the market for IP version 4 addresses that is emerging as the Regional Internet Registries and ICANN have depleted their free pool of IP addresses. The panel “Scarcity in IPv4 addresses” will feature representatives of the American Registry for Internet Numbers (ARIN) and Addrex/Depository, Inc., the new company that brokered the deal between Nortel and Microsoft. There will also be debates about Wikileaks and the future of the Internet Governance Forum. Academic research papers on ICANN’s Affirmation of Commitments, the role of national governments in ICANN, the role of social media in the Middle East/North Africa revolutions, and other topics will be presented on the second day. The event was put together by the Global Internet Governance Academic Network (GigaNet). Attendance is free of charge but you are asked to register in advance.