Milton Mueller – Technology Liberation Front (https://techliberation.com)
Keeping politicians' hands off the Net & everything else related to technology

The Politics of the ICANN transition
Thu, 11 Aug 2016

One would think that if there is any aspect of Internet policy that libertarians could agree on, it would be that the government should not be in control of basic internet infrastructure. So why are TechFreedom and a few other so-called “liberty” groups making a big fuss about the plan to complete the privatization of ICANN? The IANA transition, as it has become known, would set the domain name system root, IP addressing and Internet protocol parameter registries free of direct governmental control, and make those aspects of the Internet transnational and self-governing.

Yet the same groups that informed us that net neutrality was the end of Internet freedom, because it would have a government agency indirectly regulating discriminatory practices by private-sector ISPs, now tell us that retaining direct U.S. government regulation of the content of the domain name system root, and indirect control of the domain name industry and IP addressing via a contract with ICANN, is essential to the maintenance of global Internet freedom. It’s insane.

One mundane explanation is that TechFreedom, which is known for responding eagerly to anyone offering them a check, has found some funding source that doesn’t like the IANA transition and has, in the spirit of a true political entrepreneur, taken up the challenge of trying to twist, turn and spin freedom rhetoric into some rationalization for opposing the transition. But that doesn’t explain the opposition of Senator Cruz and other conservatives who feign a concern for Internet freedom. No, I think this split represents something bigger. At bottom, it’s a debate about the role of nation-states in Internet governance and the state’s role in preserving freedom.

In this regard it would be good to review my May 2016 blog post at the Internet Governance Project, which smashes the myths being asserted about the US government’s role in ICANN. In it, I show that NTIA’s control of ICANN has never been used to protect Internet freedom, but has been used multiple times to limit or attack it. I show that the US control of the DNS root was never put into place to “protect Internet freedom,” but was established for other reasons, and that the US explicitly rejected putting a free expression clause in ICANN’s constitution. I show that the new ICANN Articles of Incorporation created as part of the transition contain good mission limitations and protections against content regulation by ICANN. Finally, I argue that in the real world of international relations (as opposed to the unilateralist fantasies of conservative nationalists) the privileged US role is a magnet for other governments, inviting them to push for control, rather than a bulwark against it.

Another libertarian tech policy analyst, Eli Dourado, has also argued that going ahead with the IANA transition is a ‘no-brainer.’

Assistant Secretary of Commerce Larry Strickling’s speech at the US Internet Governance Forum last month goes through the FUD being advanced by TechFreedom and the nationalist Republicans one by one. Among other points, he contends that if the U.S. tries to retain control, Internet infrastructure will become increasingly politicized as rival states, such as China, Russia and Iran, argue for a sovereignty-based model and try to get internet infrastructure in the hands of intergovernmental organizations:

Privatizing the domain name system has been a goal of Democratic and Republican administrations since 1997. Prior to our 2014 announcement to complete the privatization, some governments used NTIA’s continued stewardship of the IANA functions to justify their demands that the United Nations, the International Telecommunication Union or some other body of governments take control over the domain name system. Failing to follow through on the transition or unilaterally extending the contract will only embolden authoritarian regimes to intensify their advocacy for government-led or intergovernmental management of the Internet via the United Nations.

The TechFreedom “coalition letter” raises no new arguments or issues – it is a nakedly political appeal for Congress to intervene to stop the transition, based mainly on partisan hatred of the Obama administration. But I think this debate is highly significant nevertheless. It’s not about rational policy argumentation, it’s about the diverging political identity of people who say they are pro-freedom.

What is really happening here is a rift between nationalist conservatism of the sort represented by the Heritage Foundation and the nativists in the Tea Party, on the one hand, and true free-market libertarians, on the other. The root of this difference is a radically different conception of the role of the nation-state in the modern world. Real libertarians see national borders as, at best, necessary administrative evils, and at worst as unjustifiable obstacles to society and commerce. A truly classical liberal ethic is founded on individual rights and a commitment to free and open markets and free political institutions everywhere, and thus is universalist and globalist in outlook. Such liberals see the economy and society as increasingly globalized, and understand that the institution of the state has to evolve in new directions if basic liberal and democratic values are to be institutionalized in that environment.

The nationalist Republican conservatives, on the other hand, want to strengthen the state. They are hemmed in by a patriotic and exceptionalist view of its role. Insofar as they are motivated by liberal impulses at all – and of course many parts of their political base are not – it is based on a conception of freedom situated entirely on national-level institutions. As such, it implies walling the world off or, worse, dominating the world as a pre-eminent nation-state. The rise of Trump and the ease with which he took over the Republican Party ought to be a signal to the real libertarians that the Republican Party is no longer viable as a lesser-of-two-evils home for true liberals. The base of the Republican Party, the coalition of constituencies and worldviews of which it is composed, is splitting into two camps with irreconcilable differences over fundamental issues. Good riddance to the nationalists, I say. This split poses a tremendous opportunity for libertarians to finally free themselves of the social conservatives, nationalist militarists, nativists and theocrats that have dragged them down in the GOP.

Get Real(ist): Don’t confuse NSA regulation with Internet regulation
Sun, 27 Oct 2013

In her UN General Assembly speech denouncing NSA surveillance, Brazil’s President Dilma Rousseff said:

Information and communications technologies cannot be the new battlefield between States. Time is ripe to create the conditions to prevent cyberspace from being used as a weapon of war, through espionage, sabotage, and attacks against systems and infrastructure of other countries. … For this reason, Brazil will present proposals for the establishment of a civilian multilateral framework for the governance and use of the Internet and to ensure the protection of data that travels through the web.

We share her outrage at mass surveillance. We share her opposition to the militarization of the Internet. We share her concern for privacy.

But when President Rousseff proposes to solve these problems by means of a “multilateral framework for the governance and use of the Internet,” she reveals a fundamental flaw in her thinking. It is a flaw shared by many in civil society.

You cannot control militaries, espionage and arms races by “governing the Internet.” Cyberspace is one of many aspects of military competition. Unless one eliminates or dramatically diminishes political and military competition among sovereign states, states will continue to spy, break into things, and engage in conflict when it suits their interests. Cyber conflict is no exception.

Rousseff is mixing apples and oranges. If you want to control militaries and espionage, then regulate arms, militaries and espionage – not “the Internet.”

This confusion is potentially dangerous. If the NSA outrages feed into a call for global Internet governance, and this governance focuses on critical Internet resources and the production and use of Internet-enabled services by civil society and the private sector, as it inevitably will, we are certain to get lots of governance of the Internet, and very little governance of espionage, militaries, and cyber arms.

In other words, Dilma’s “civilian multilateral framework for the governance and use of the Internet” is only going to regulate us – the civilian users and private sector producers of Internet products and services. It will not control the NSA, the Chinese People’s Liberation Army, the Russian FSB or the British GCHQ.

Realism in international relations theory is based on the view that the international system is anarchic. This does not mean that it is chaotic, but simply that the system is composed of independent states and there is no central authority capable of coercing all of them into following rules. The other key tenet of realism is that the primary goal of states in the international system is their own survival.

It follows that the only way one state can compel another state to do anything is through some form of coercion, such as war, a credible threat of war, or economic sanctions. And the only time states agree to cooperate to set and enforce rules is when it is in their self-interest to do so. Thus, when sovereign states come together to agree to regulate things internationally, their priorities will always be to:

  • Preserve or enlarge their own power relative to other states; and
  • Ensure that the regulations are designed to bring under control those aspects of civil society and business that might undermine or threaten their power.

Any other benefits, such as privacy for users or freedom of expression, will be secondary concerns. That’s just the way it is in international relations. Asking states to prevent cyberspace from being used as a weapon of war is like asking foxes to guard henhouses.

That’s one reason why it is so essential that such intergovernmental conferences be fully open to non-state actors, and that they not be organized around national representation.

Let’s think twice about linking the NSA reaction too strongly to Internet governance. There is some linkage, of course. The NSA revelations should remind us to be realist in our approach to Internet governance. This means recognizing that all states will approach Internet regulation with their own survival and power uppermost in their agenda; it also means that any single state cannot be trusted as a neutral steward of the global Internet but will inevitably use its position to benefit itself. These implications of the Snowden revelations need to be recognized. But let us not confuse NSA regulation with Internet regulation.

WTF? WTPF! The continuing battle over Internet governance principles
Tue, 23 Apr 2013

Remember all the businesses, internet techies and NGOs who were screaming about an “ITU takeover of the Internet” a year ago? Where are they now? Because this time, we actually need them.

May 14–21 is Internet governance week in Geneva. We have declared it so because there will be three events in that week for the global community concerned with global internet governance. From 14-16 May the International Telecommunication Union (ITU) holds its World Telecommunication Policy Forum (WTPF). This year it is devoted to internet policy issues. With the polarizing results of the Dubai World Conference on International Telecommunications (WCIT) still reverberating, the meeting will revisit debates about the role of states in Internet governance. Next, on May 17 and 18, the Graduate Institute of International and Development Studies and the Global Internet Governance Academic Network (GigaNet) will hold an international workshop on The Global Governance of the Internet: Intergovernmentalism, Multi-stakeholderism and Networks. Here, academics and practitioners will engage in what should be a more intellectually substantive debate on modes and principles of global Internet governance.

Last but not least, the UN Internet Governance Forum will hold its semi-annual consultations to prepare the program and agenda for its next meeting in Bali, Indonesia. The IGF consultations are relevant because, to put it bluntly, it is the failure of the IGF to bring governments, the private sector and civil society together in a commonly agreed platform for policy development that is partly responsible for the continued tension between multistakeholder and intergovernmental institutions. Whether the IGF can get its act together and become more relevant is one of the key issues going forward.

Internet Governance Principles

The Dubai WCIT meeting last year grafted an Internet governance principles debate onto negotiations over an old telecommunications treaty that had little to do with the internet. That muddled the debate considerably. This time, we are actually having a debate about Internet governance principles, specifically the role of states and intergovernmental institutions.

In preparation for the WTPF, the ITU’s Secretary-General has released a 38-page report and five “Draft Opinions” on policy. The stated aim of the WTPF report is “to provide a basis for discussion at the Policy Forum…focusing on key issues on which it would be desirable to reach conclusions.” This is what the IGF ought to be doing but was prevented from doing by key stakeholders in the Internet technical and business communities, because they wanted to make sure the IGF could not be used to challenge the status quo.

The ITU SG’s report contains a fairly balanced survey of many internet-related policy controversies. After digesting it, however, it becomes clear that its main purpose is to re-assert and strengthen the role of governments in Internet governance. In particular, it proposes a definition of multi-stakeholderism that reserves to states a ‘sovereign right’ to make ‘public policy for the Internet;’ a definition that relegates the private sector and civil society to secondary, subordinate roles rather than empowering them as equal-status participants in new institutions for Internet governance. In keeping with this philosophy, the discussions at WTPF will be confined to ITU member states and sector members. Ordinary citizens cannot speak, they can only watch.

A flawed debate

What’s troubling about this looming debate is the intellectual weakness of so many of the supposed defenders of internet freedom. The Internet Society, ICANN and the U.S. government have increasingly re-branded Internet freedom as “The Multistakeholder Model” (TMM). So the choice we are given is not between a free Internet and a restricted, censored one, or between centralized, hierarchical internet governance and a more distributed, participatory, open and decentralized governance. No, we are given a choice between the ITU and a status quo that is vaguely defined as TMM. This not only implies that there is a single, well-defined “Multistakeholder Model” (in fact, there is not), but it conflates the results of good governance (freedom, openness, innovation, globalized connectivity, widespread access) with a particular model. It also tends to exempt many of the existing Internet governance institutions from deserved criticism and reform.

The lack of intellectual substance underlying the principles debate was played out with stark clarity in the U.S. two weeks ago, when the U.S. Congress proposed a bill “to Affirm the Policy of the United States Regarding Internet Governance.” The bill originally said

“It is the policy of the United States to promote a global Internet free from government control and to preserve and advance the successful multistakeholder model that governs the Internet.”

For reasons that we outlined in an earlier blog post, the “government control” language was deemed too controversial and the bill was amended to read:

“It is the policy of the United States to preserve and advance the successful multistakeholder model that governs the Internet.”

So the United States has officially refused to endorse freedom from government control as a policy underlying its approach to Internet governance. It does not, apparently, have any principled objection to censorship, state surveillance to facilitate political manipulation of the population, over-regulation, over-taxation, economic protectionism and other destructive forms of governmental intervention. All those things are fine, apparently, as long as we manage to “preserve and advance” multistakeholder governance. What an uninspiring stance!

Why should anyone support TMM if it is devoid of any substantive meaning regarding the role of states and freedom from governmental control? TMM inspires support only if it is presented as a better alternative to a form of governance that is authoritarian, repressive, ineffective and unrepresentative of Internet users’ interests. In other words, we should support TMM only insofar as it contains and limits the power of nation-states to interfere unduly with the use and operation of the Internet, and empowers individuals worldwide to govern themselves. TMM is not an end in itself. In fact, once it is stripped of substantive policy norms, dogmatic support for TMM seems indistinguishable from unqualified support for existing Internet institutions.

As we enter into this crucial debate about principles of Internet governance, we need to have a better understanding of why global Internet governance institutions need to be shielded from national governments. Below we provide some simple bullet points as a guide to the ongoing debate over principles regarding the role of states in Internet governance.

  • The political unit – the polity – for Internet governance should be the transnational community of Internet users and suppliers, not a collection of states.

There is a fundamental, lasting conflict between territorial jurisdiction and the global Internet. There is a fundamental difference between a collection of leaders of national polities and a global polity. Though national governments can provide legitimate and rights-respecting modes of ordering society within their jurisdiction, at the transnational level there is anarchy, a space where the problems of governance are best addressed by new institutions with direct participation and more open channels of communication. National governments are not ‘just another stakeholder’ in a multistakeholder system: they represent a competing, alternative institutional framework.

  • A system of Internet governance based on states is inherently biased toward greater restriction and control of the Internet’s capabilities.

States are by nature oriented toward control. More specifically, they are concerned about maintaining their own control qua sovereign entity in a territory. They will, therefore, act to limit forms of choice and access that provide alternatives to their control of communication and information. In the international arena they will bargain with other states to maintain their security and control in relation to other states. They will not be optimal representatives of the interests of Internet users in freedom, access and openness. Ever.

  • The threats to Internet freedom posed by states are more serious than those posed by private actors.

Get over the stuff about ‘Googledom’ and ‘Facebookistan.’ It’s a cute metaphor but there is really no comparison between sovereigns and these businesses. States have the power to tax and expropriate, they have a monopoly on the use of force, they generate armed conflicts that result in war; they fund and deploy weapons. You do not choose to use their services. However much you might think you are locked in to Google, there is still a huge qualitative difference between your ability to use or not use its services and the choice you have with respect to states. This doesn’t mean that the private sector is perfect, or that there is never a need for states to order or regulate what private actors do, but it helps to keep your priorities straight.

  • Multi-stakeholderism is not a panacea

Multistakeholderism as an ideology originated as a pragmatic means of opening up intergovernmental organizations (IGOs) to broader representation and participation. As a transitional mechanism for infusing IGOs with more information, expertise and voice, it has worked wonderfully. But it is not a well-defined, ultimate solution to the problem of Internet governance. The organically evolved Internet institutions were not originally conceived as “multistakeholder” but as private sector and contractually based governance. Some forms of Internet governance, such as the IETF, are truly bottom up, based on individualized representation, decentralized and largely voluntary in effect. Others, like ICANN, are highly centralized, largely coercive, and deeply enmeshed with states in a hybrid form of global governance. The virtues (and faults) of one should not be visited upon the other.

An Internet ‘free from Government Control’: A worthy principle
Mon, 15 Apr 2013

On Wednesday, April 10, a bill “to Affirm the Policy of the United States Regarding Internet Governance” was marked up in the U.S. House of Representatives. The bill is an attempt to put a formal policy statement into statute law. The operative part says simply:

It is the policy of the United States to promote a global Internet free from government control and to preserve and advance the successful multistakeholder model that governs the Internet.

Yet this attempt to formulate a clear principle and make it legally binding policy has become controversial. This has happened because the bill brings to a head the latent contradictions and elisions that characterize U.S. international Internet policy. In the process it has driven a wedge into what was once a unified front of U.S. Democrats and Republicans against incursions into Internet governance by intergovernmental organizations such as the ITU.

The problem, it seems, is that the Democratic side of the aisle can’t bring itself to say that it is against ‘government control’ per se. Indeed, the bill has forced people linked to the Obama administration to come out and openly admit that ‘government control’ of the internet is OK when we exercise it; it’s just those other countries and international organizations that we need to worry about.

The U.S. has been deeply enmeshed in this contradiction ever since the World Summit on the Information Society in 2003-5, when it fended off criticisms of the U.S.-controlled ICANN while claiming to oppose ‘government control.’ In the meantime various US government agencies have (largely unaware of, or independent of, the Internet freedom rhetoric) cast global shadows of hierarchy over various aspects of the Internet, pursuing extraterritorial domain name takedowns, ACTA, restrictions on online gambling, cyber-weapons, and so on.

Until now, the contradiction has remained latent, a sotto voce muttering that the emperor has no clothes. Only a few hyper-critical academics (like us) were willing to articulate the argument, generally irritating everyone in the process. But now it’s out in the open. The double standard is humorously evident in this video showing the testimony of Rep. Eshoo, a Democrat of California, in the markup hearings. Rep. Eshoo says:

“…the expert agencies have expressed concern with the term, quote, ‘government control,’ unquote. One diplomat suggested that the use of his term could actually undermine existing Internet governance institutions such as ICANN because of its, uh, uh, close relationship with, uh, our government. Foreign countries frequently cite the close coordination between ICANN and US Dept of Commerce as an example of US quote ‘control’ over the internet.”

Well, yes, Rep. Eshoo, other countries do look at ICANN as a form of global Internet control exercised by one government. Are they wrong? ICANN gets its policy-making authority over the DNS root directly from a contract with the U.S. government, and in exchange for receiving that contract ICANN has to stay in the U.S. and conform to various policies. This is not ‘close coordination;’ it’s control. Not even the slipperiest politician can plausibly deny this.

A similar double standard was raised in the response of Public Knowledge (PK), a U.S. public interest group. PK happily collected grants to join the U.S.-led charge against ‘government control of the Internet’ in the renegotiation of the ITU’s International Telecommunication Regulations. It joined in the anti-government rhetoric about how the Internet had to be left alone. Now it wants to clarify its position a bit:

we fear that the broad language of the proposed bill may intrude on areas of consumer protection, competition policy, law enforcement and cybersecurity long considered appropriate for national policy formulated by governments with input from civil society, business and the technical community.

Like Rep. Eshoo, PK is forced to distinguish between government control at home (the good kind) and government control that involves the rest of the world (the scary kind). Note that PK also tacitly accepts the description of different roles for government and civil society that the authoritarian states put into the WSIS Tunis Agenda: governments formulate policy and the rest of us just provide input.

Remember, at the end of the WCIT negotiations we were being told that an indirect reference to spam (“unsolicited bulk electronic communications”) in the ITRs opened the door to systematic content regulation on a global basis. Now PK is forced to admit that:

Although we opposed the ITU resolution to require countries to limit spam, the United States protects its citizens from spam through the CAN-SPAM Act.

Indeed. And why are domestic spam laws fine and international ones (that would have to be enforced by and consistent with those same domestic laws, and ratified by the same national legislature that passed the domestic laws) a threat to the very basis of free expression? According to PK,

Our opposition to ceding authority to the ITU to decide how to balance consumer protection and free expression is not because we see no role for government in protecting consumers or promoting competition. Rather, we believe those matters are best decided here at home, by a Congress accountable to the people and enforced by a government constrained by the Constitution.

So has PK gone cyber-nationalist? Like the Chinese, the Russians, the Saudis and the Iranians, does it want a balkanized Internet governed by a separate and distinct series of national sovereigns? If so, what, exactly, is wrong with the ITU as a venue for negotiating governance? The ITU is a global governance institution founded on the principles of national sovereignty.

We think it’s high time to call the bluff of American politicians and advocacy groups that play with this double standard. If they cannot bring themselves to embrace a principle of “a global Internet free from government control” it’s time to ask them what they do stand for.

Defending the legitimate rights of consumers to be protected against fraud or monopolies is not “government control” of the Internet, by any serious definition. By protecting individual rights to privacy, by challenging coercive and collusive monopolies and by prosecuting fraud, governments are maintaining individual freedom, not exerting control. It is worrisome, therefore, that allegedly liberal groups such as PK want to maintain an option for ‘government control’ at the level of broad principle.

PK’s reversion to cybernationalism is both intellectually flawed and politically disturbing. Its attempt to distinguish between national laws and international ones falls apart completely when examined. Laws that overreach and over-regulate occur at both levels; PK simultaneously underestimates the dangers of government control at home (which is odd, given its involvement in issues such as CISPA) and overstates the dangers of international laws (which typically have to be ratified domestically and are subject to reservations).

Whether you are talking about China, Russia or the USA, you can’t have a free Internet and a national Internet. As a virtual space constructed out of a globally interconnected infrastructure, cyberspace realizes its highest potential when it is not artificially bounded by jurisdiction or hierarchically imposed filters. Right now, the biggest threats to internet freedom are from national governments. And while there are indeed aspects of communications that can and should be left to domestic regulation, any regulation that is too scary to be implemented at the international level probably poses many of the same dangers when enacted at the national level. The idea that we only have to worry about ‘government control’ when we are talking about foreign governments is obviously wrong.

The House bill articulates a worthy principle that can be and should be globally applicable to the Internet. Not controlling the Internet does not mean that there is no role for laws or regulations that safeguard individual rights; it means that national governments should recognize the Internet’s transnational nature and refrain from trying to suppress the rights to free expression and free association that have emerged in the context of a decentralized Internet not under the control of any sovereign.

How ARIN and U.S. Commerce Department were duped by the ITU
Fri, 29 Mar 2013

ARIN is the Internet numbers registry for the North American region. It likes to present itself as a paragon of multistakeholder governance and a staunch opponent of the International Telecommunication Union’s encroachments into Internet governance. Surely, if anyone wants to keep the ITU out of Internet addressing and routing policy, it would be ARIN. And conversely, in past years the ITU has sought to carve away some of the authority over IP addressing from ARIN and other RIRs.

But wait, what is this? On March 15 the ITU Secretary-General released a preparatory report for the ITU’s World Telecommunications Policy Forum, which will take place in Geneva May 14-16. The report contains six Internet-related policy resolutions “to provide a basis for discussion … focusing on key issues on which it would be desirable to reach conclusions.” Draft Opinion #3 pertains to Internet addressing. Among other things, the draft resolves:

  • “that needs-based address allocation should continue to underpin IP address allocation, irrespective of whether they are IPv6 or IPv4, and in the case of IPv4, irrespective of whether they are legacy or allocated address space;”
  • “that all IPv4 transactions be reported to the relevant RIRs, including transactions of legacy addresses that are not necessarily subject to the policies of the RIRs regarding transfers, as supported by the policies developed by the RIR communities;”
  • “that policies of inter-RIR transfer across all RIRs should ensure that such transfers are needs based and be common to all RIRs irrespective of the address space concerned.”

These policy positions thrust the ITU and its intergovernmental machinery directly into the realm of IP addressing policy. But that is quite predictable; the ITU has always wanted to do that. What is unusual about these resolutions is that they bear an uncanny resemblance to the policy positions currently advocated by ARIN and the U.S. Department of Commerce.

In other words, far from challenging the authority of the RIRs, as it used to do, the ITU now seems to be supinely issuing policy positions that reflect the interests of the RIRs. And after checking with sources who were at the meetings where these draft opinions were created, I confirmed that it was indeed ARIN staff, other RIRs and U.S. Commerce Department representatives who pushed for these positions. Indeed, some sources complained that the whole discussion was completely dominated by RIRs and the U.S.; hardly anyone else was participating.

This is a rather significant turn of events. If nothing else, it makes you think twice about the claims coming out of Dubai that the Internet’s organic multistakeholder institutions were locked in a to-the-death struggle with the forces of repression and authoritarianism in the ITU.

Why did this happen?

As we have noted in earlier blogs, ARIN’s staff and board cling to needs-based address allocations because it gives them control, and they want to retain policy authority over legacy address block holders – because it gives them control. Yet its authority over legacy holders is questionable, to say the least. Legacy block holders not only have no contract with ARIN, they received their number blocks before ARIN existed. Many of them would like to be able to sell numbers to any buyer, regardless of ARIN approvals or needs assessments. ARIN’s current leadership just can’t bring itself to accept this.

Apparently, ARIN is so desperate to validate its shaky claim of authority over legacy address space that it will go to any lengths to find support for it – including inserting its policy preferences into an ITU resolution.

What the geniuses at Commerce and ARIN do not seem to understand is that by getting the ITU to be their sock puppet, they are also legitimizing the notion that the ITU and its collection of governments have a legitimate role to play in making and enforcing IP address policy. And yet there is a nice bargain here: ARIN uses the ITU process to validate its position; the ITU validates its process by having ARIN use it.

It is clear that the ITU no longer cares much what the substantive policy is; it just wants to be recognized as a platform for global Internet policy. Indeed, it is ironic that just as the more enlightened sections of the Internet technical community are starting to question or openly reject needs assessment, the ITU is just starting to embrace it. Insert your favorite joke about regulatory dinosaurs here: by the time the ITU starts endorsing the conventional wisdom, it’s probably no longer wisdom.

Regulating the Market for Zero-day Exploits: Look to the demand side
https://techliberation.com/2013/03/15/regulating-the-market-for-zero-day-exploits-look-to-the-demand-side/ (Fri, 15 Mar 2013 20:04:34 +0000)

A market has developed in which specialized firms discover new vulnerabilities in software and sell that knowledge for tens or hundreds of thousands of dollars. These vulnerabilities are known as “zero-day exploits” because the software vendor has had zero days’ warning of them before they are first used. In this blog post, we recognize that this market may require some kind of policy response, but reject simplistic calls for “regulation” of suppliers. We recommend focusing on the demand side of the market.

Although there is surprisingly little hard evidence of its scope and scale, the market for vulnerabilities is considered troublesome or dangerous by many. While the bounties paid may stimulate additional research into security, it is the exclusive and secret possession of this knowledge by a single buyer that raises concerns. It is clear that when someone other than the software vendor pays $100,000 for a zero-day, they are probably not paying for defense, but rather for an opportunity to take advantage of someone else’s vulnerability. Thus, the vulnerabilities remain unpatched. (Secrecy also makes the market rather inefficient; it may be possible to sell the same “secret” to several buyers.)

The supply side of the market consists of small firms and individuals with specialized knowledge. They compete to be the first to identify new vulnerabilities in software or information systems and then bring them to buyers. Many buyers are reputed to be government intelligence, law enforcement or military agencies using tax dollars to finance purchases. But we know less about the demand side than we should. The point, however, is that buyers are empowered to initiate an attack, a power that even legitimate organizations could easily abuse.

Insofar as the market for exploits shifts incentives away from publicizing and fixing vulnerabilities toward competitive efforts to gain private, exclusive knowledge of them so they can be held in reserve for possible use, the market has important implications for global security. It puts a premium on dangerous vulnerabilities, and thus may put the social and economic benefits of the Internet at risk. While the US might think it has an advantage in this competition, as a leader in the Internet economy and one of the most cyber-dependent countries, it also has the most to lose.

Unfortunately, so far the only policy response proposed has been vague calls for “regulation.” Chris Soghoian in particular has made “regulation” the basis of his response, calling suppliers “modern-day merchants of death” and claiming that “Security researchers should not be selling zero-days to middle man firms…These firms are cowboys and if we do nothing to stop them, they will drag the entire security industry into a world of pain.”

Such responses, however, are too long on moral outrage and too short on hard-headed analysis and practical proposals. The idea that “regulation” can solve the problem overlooks major constraints:

  • The market is transnational and thus regulation of supply would require agreement among contending nation-states. National security interests are implicated, making agreement among states difficult.
  • Disclosure and enforcement would be challenging. Unlike physical weapons systems, exploits are invisible and traded digitally. Buyers and sellers have strong incentives not to disclose deals. James Lewis of CSIS, who worked on a project to restrict access to or exports of software, claims it “was impossible to control – there were so many ways to beat any restrictions, so many people who could write the code.”
  • The line between legitimate security services/research and the market for zero-day exploits is thin and blurry. Regulating exploit supply may translate into regulating all security software development, which would be costly and economically stifling.
  • It would be relatively easy for this type of market to go underground if regulation chafed. Governments could bring such R&D in-house instead of using an external market. Sales to terrorist or criminal groups are unlikely to be affected by any national or international system of regulation.

Despite these constraints, we do need to seriously consider ways to redirect incentives away from the discovery and possible exploitation of vulnerabilities towards discovering, publicizing and fixing them for the public benefit.

We suggest focusing policy responses on the demand side rather than the supply side. The zero-day market is largely a product of buyers, with sellers responding to that demand. And if it is true that much of the demand comes from the US Government itself, we should have a civilian agency such as DHS compile information about the scope and scale of our participation in the exploits market. We should also ask friendly nations to assess and quantify their own efforts as buyers, and share information about the scope of their purchases with us. If U.S. agencies and allies are key drivers of this market, we may have the leverage we need to bring the situation under control.

One idea that should be explored is a new federal program to purchase zero-day exploits at remunerative prices and then publicly disclose the vulnerabilities (using ‘responsible disclosure’ procedures that permit directly affected parties to patch them first). The program could systematically assess the nature and danger of each vulnerability and pay commensurate prices. It would need to be coupled with strong laws requiring all government agencies – including military and intelligence agencies – to disclose exploits with the potential to undermine the security of public infrastructure. If other, friendly governments joined the program, the costs could be shared along with the information.
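
The idea of systematically assessing vulnerabilities and paying “commensurate prices” could be made concrete with a published price schedule. The sketch below is purely hypothetical: every tier, multiplier and dollar figure is invented for illustration (loosely modeled on CVSS-style severity scoring) and does not describe any actual program.

```python
# Hypothetical sketch of a severity-indexed bounty schedule for a
# purchase-to-disclose program. All figures are invented for illustration.

BASE_BOUNTY = 50_000  # assumed floor price in USD


def bounty(severity: float, affects_infrastructure: bool) -> int:
    """Return an offer price for a disclosed zero-day.

    `severity` is an assessed score in [0.0, 10.0], loosely modeled
    on CVSS-style scoring; the tiers below are illustrative only."""
    if not 0.0 <= severity <= 10.0:
        raise ValueError("severity must be in [0.0, 10.0]")
    if severity < 4.0:
        multiplier = 0.5   # low: nuisance bugs
    elif severity < 7.0:
        multiplier = 1.0   # medium
    elif severity < 9.0:
        multiplier = 2.0   # high: e.g. remote code execution
    else:
        multiplier = 4.0   # critical
    price = BASE_BOUNTY * multiplier
    # Premium for flaws in widely deployed public infrastructure,
    # which the post argues deserve priority disclosure.
    if affects_infrastructure:
        price *= 1.5
    return int(price)
```

Under this invented schedule, a critical flaw in public infrastructure (severity 9.3) would fetch a $300,000 offer, while a low-severity bug would fetch $25,000; the point is only that “commensurate” pricing is straightforward to specify and publish.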

In other words, instead of engaging in a futile effort to suppress the market, the US would attempt to create a near-monopsony that would pre-empt it and steer it toward beneficial ends. Funds for this purchase-to-disclose program could replace current funding for exploit purchases.

Obviously, terrorists, criminals or hostile states bent on destruction or break-ins would not be turned away from developing zero-days by the prospect of getting well-paid for their exploits. But most of the known supply side of the market does not seem to be composed of terrorists or criminals, but rather profit-motivated security specialists. And it’s likely that legitimate, well-paid talent will discover more flaws than “the dark side” in the long run.

Obviously the details regarding the design, procedures and oversight of this program would need to be developed. But on its face, a demand-side approach seems much more promising than railing against the morality of so-called cyber arms dealers.

Freedom to innovate and new top level domains
https://techliberation.com/2013/03/08/freedom-to-innovate-and-new-top-level-domains/ (Fri, 08 Mar 2013 20:44:26 +0000)

There are hundreds of applications for generic words in ICANN’s new top level domain program. They include .BOOK, .MUSIC, .CLOUD, .ACCOUNTANT, .ARAB and .ART. Some of the applicants for these domains have chosen to make direct use of the name space under the TLD for their own sites rather than offering them for broad general use. Amazon, for example, would probably make .BOOK an extension of its online bookstore rather than part of a large-scale domain name registration business; Google would probably make .CLOUD an extension of its own cloud computing enterprises.

This is really no different from Barnes and Noble registering BOOK.COM and using it only for its bookstore, Scripps registering FOOD.COM and controlling the content of the site, or CNET registering NEWS.COM and making exclusive use of the site for its own news and advertising. Nor is it terribly different from the .MUSEUM top level domain.

Yet these proposals have generated a loud chorus of objections from competing businesses. They have dubbed these applications ‘closed generics’ and shouted so loudly that ICANN is once again considering changing its policies in mid-implementation. ICANN staff has called for public comment and asked specifically whether it should change its rules to determine what is a ‘generic term’ and whether ICANN should enlarge even further its role as a top-down regulator and dictate whether certain business models can be associated with certain domain names.

A group of Noncommercial Stakeholders Group (NCSG) members have weighed in with some badly-needed disinterested public comment. It isn’t about ‘open’ or ‘closed,’ they maintain, it is about the freedom to innovate.

As NCSG stakeholders, our position is driven neither by paying clients nor by an interest in the success of specific applications. It is based on a principled commitment to the ‘permissionless innovation’ that has made the Internet a source of creativity and growth. Our aim is to maximize the options available to DNS users and to minimize top-down controls. We support the freedom of individuals and organizations to register domains and use them legally in any way they see fit. We support experimentation with new ideas about what a TLD can do. We see no reason to impose ex ante restrictions on specific business models or methods of managing the name space under a TLD.

The group warns ICANN of the danger of giving itself the power to decide what qualifies as a ‘generic word’ and rejects any attempt to retroactively create new policies that would dictate business models for TLD applicants. Hopefully ICANN’s board will be able to look past the self-interested cries of businesses that want to eliminate competitors and consider the public interest in Internet freedom. The comments and list of supporters are available at this link.

More disclosure needed: Why did the US government insist on retaining the International Telecom Regs?
https://techliberation.com/2012/06/08/more-disclosure-needed-why-did-the-us-government-insist-on-retaining-the-international-telecom-regs/ (Fri, 08 Jun 2012 15:21:21 +0000)

In another blog post, I put the International Telecommunication Union’s WCIT into perspective. I ended that discussion with a question that no one else seems to be asking: should there be International Telecommunication Regulations (ITRs) at all? Why do we need them?

I don’t think we do need sector-specific international regulations. I think they can cause more trouble than benefit. To briefly explain why, I noted that every country has its own national regulations regarding interconnection, privacy, antitrust, consumer protection, and so on. Compatibility across platforms and services is much easier technically than it was in the 1930s and before, and tends to get worked out in the market through a variety of bridging technologies and nongovernmental standards forums. International telecommunications is a form of trade in services, and the WTO agreements already provide a sufficient regulatory basis for foreign or multinational providers to enter national markets and offer transnational services. Though not all countries are members of WTO, membership can be expanded and bilateral or regional agreements can supplement it.

Imagine my surprise when someone informed me that the Europeans were calling for the abrogation of the ITRs for exactly those reasons. Apparently they had defended that position for years. But the European drive to get rid of the ITRs was opposed and eventually blocked by — wait for it — the United States of America! The US, I am told, argued that the existing treaty was essential because most of the world’s international communications were regulated by it.

That puts a dramatically new spin on the US’s current campaign to fend off an ITU “takeover” of the Internet. If revision of the ITRs is such a threat to the Internet, why did the US insist on retaining them? If the ITRs are retained, it is inevitable that they will have to be updated and revised. And yet now, the US government is warning us that the revision process poses a major threat to the independence and freedom of the Internet. Something is wrong with this picture.

Most of my information about this is second-hand, from sources that want to remain off the record. But there is proof that the US has defended the importance of the ITRs in an ITU list of documents that can be viewed here. There, in a document repository of an ITU expert group that was preparing the grounds for the WCIT, one finds a document submitted by the US entitled the “Continued Critical Role of the ITRs.” Now if you click on the link that I have mischievously placed to that document, you will be taken to a closed, login-required page; before you can read that document, you have to be a TIES member. In other words, this is yet another example of the closed nature of the ITU process. There is another set of papers here that would be of interest in understanding why we even have the ITRs. But they, too, are locked inside TIES.

And that means, this is a job for WCITleaks! The U.S. government should release this document, and if it doesn’t, inside whistleblowers and other people with access to a TIES account need to leak it to us.

ICANN must not back down
https://techliberation.com/2011/12/29/icann-must-not-back-down/ (Thu, 29 Dec 2011 18:18:35 +0000)

ICANN’s plan to open up the domain name space to new top level domains is scheduled to begin January 12, 2012. This long overdue implementation is the result of an open process that began in 2006. It would, in fact, be more realistic to say that the decision has been in the works for 15 years; i.e., since early 1997. That is when demand for new top-level domain names, and the need for other policy decisions regarding the coordination of the domain name system, made it clear that a new institutional framework had to be created. ICANN was the progressive and innovative U.S. response to that need. It was created to become a nongovernmental, independent, truly global and representative policy development authority.

The result has been far from perfect, but human institutions never are. Over the past 15 years, every stakeholder with a serious interest in the issue of top level domains has had multiple opportunities to make their voice heard and to shape the policy. The resulting new gTLD policy reflects that diversity and complexity. From our point of view, it is too regulatory, too costly, and makes too many concessions to content regulators and trademark holders. But it will only get worse with delay. The compromise that came out of the process paves the way for movement forward after a long period of artificial scarcity, opening up new business opportunities.

Now there is a cynical, illegitimate last-second push by a few corporate interests in the United States to derail that process.

The arguments put forward by these interests are not new; they are the same anti-new TLD arguments that have been made since 1997 and the concerns expressed are all addressed in one way or another by the policies ICANN has developed. Their only new claim is that they have the ear of powerful people in the United States government, including Senator Jay Rockefeller.

In effect, U.S. corporate trademark interests are openly admitting that their participation in the ICANN process has been in bad faith all along. Despite the multiple concessions and numerous re-dos that these interests managed to extract over the past 6 years, they are now demanding that everything grind to a halt because they didn’t get exactly what they demanded — as if no other interests and concerns mattered and no other stakeholders exist. What they wanted, in fact, was simply to freeze the status quo of 1996 into place forever, so that there would be no new competition, no new entrepreneurial opportunities, no linguistic diversification, nothing that would have the potential to cause them any problems.

That group’s demands must be rebuffed, unambiguously and finally. ICANN must start implementing the new TLD program on January 12 as scheduled. ICANN must keep its promise to those who participated in its processes in good faith.

To its everlasting credit, the U.S. Commerce Department, the official governmental contractor and supervisor of ICANN, has not caved in to the cynical corporate obstructionism. They realize what is at stake. Assistant Secretary of Commerce Lawrence Strickling is responsible and intelligent enough to understand what an unmitigated disaster it would be to pull the plug on 15 years of work.

The stakes here go well beyond the merits or de-merits of new top level domains. Any move to delay or pull back on the start date of the new TLD program is an admission that ICANN does not really make the basic policy decisions regarding the global domain name system. It is an admission that ICANN itself is a failure. That throws us straight back to 1997, re-opening all the instability and turmoil that we have tried to resolve by the creation of a new global governance institution.

If ICANN blinks, if it deviates from or delays its agreed and hard-fought policy in the slightest way, the coup d’etat succeeds. Everyone in the world then concludes that a few corporate interests in the United States hold veto power over the policies of the Internet’s domain name system. Imagine the centrifugal forces that are unleashed as a result. Imagine the impact in Russia, China, Brazil, India, South Africa, and even the EU, when they are told in no uncertain terms that ICANN’s policy making is hostage to the whims of a few well-placed, narrowly focused U.S. business interests; that they can invest thousands of person-hours and substantial resources in working within that framework only to see the rug pulled out from under them by a campaign by the ANA and an editorial by the New York Times. The entire institutional infrastructure we have spent 15 years building will be drained of its life.

Of course no one is perfectly happy with the new TLD program. But no one should assume that they can have exactly what they want, and that the whole process should be stopped until they get it. Any modification to the DNS involves millions of stakeholders and dozens of conflicting interest groups. Without question, we have reached the point where satisfying one group more will satisfy other groups less. The idea that the basic conflicts of interest at the heart of this controversy will magically vanish if we delay things is worse than naïve; it is socially pathological. To delay now is to give one very narrow interest group exactly what it wants (no new TLDs) and everyone else nothing. That is a far worse solution than ICANN’s flawed but workable program.

On January 12, ICANN needs to make a statement that it is going forward and strongly reaffirm its ability to deliver on the promise of a nongovernmental, multistakeholder, global governance institution. It should celebrate the opening of its application window. It should throw a big party. In this case, it has both the authority and the legitimacy it needs to proceed.

Trading IPv4 Addresses Starts Making Internet Elders Nervous
https://techliberation.com/2011/08/15/trading-ipv4-addresses-starts-making-internet-elders-nervous/ (Mon, 15 Aug 2011 18:23:38 +0000)

Paul Vixie, a renowned Internet pioneer who runs the Internet Systems Consortium, has written an article in ACM Queue attacking “those who would unilaterally supplant or redraw the existing Internet resource governance or allocation systems.” The publication of this article is a sign of a growing, important debate around the reform of IP address registries in the age of IPv4 exhaustion.

Vixie defends the Regional Internet Registries’ monopoly on IP address registration services and their current needs-based policies toward address transfers. I am sure that Paul sincerely believes in the arguments he makes, but it’s also true that Vixie is the chairman of the Board of the American Registry for Internet Numbers (ARIN), the regional address registry for North America. When Vixie argues that ARIN’s exclusive control over Whois and address transfer services is beneficial and “in the physics,” he is also defending the authority and revenue model of his own organization against a perceived threat.

And that takes us to another relevant fact. The argument Vixie makes is cast in generalities, but he is really attacking a specific firm, a holding company known as Denuo. Denuo has formed both a secondary marketplace called Addrex for the legitimate trading of IPv4 number blocks, and an IP address registrar company known as Depository. Let’s set aside Depository for the moment (I will come back to it) and concentrate on Addrex, which has become the first end-to-end platform for legacy address holders to sell their IPv4 number blocks. Famously, Addrex scored a major success as the intermediary for the Nortel-Microsoft trade. But Nortel-Microsoft was unusually visible because it had to go through bankruptcy court. Is anything else happening? Since then, I have spoken to Addrex’s president, Charles Lee, to find out. “We are very busy signing up a growing number of global corporate and governmental customers to sell their unused assets,” he said. I asked him what the buyer side of the marketplace was beginning to look like, and he said, “Our value proposition to large Asian network operators has resonated quite effectively and we expect to enter into many agreements with them over the coming months.” Surely Vixie and the ARIN Board have gotten wind of this. So when Vixie begins a public attack on this company and its business model, he is signaling to the rest of us that ARIN is worried.

It should be. Vixie’s article is premised on a stretched, invalid comparison of Denuo to the proponents of competing DNS roots. But alt.root proponents were almost always lightweight rebel-operators who could never, as Vixie correctly points out, make a stable, serious value proposition to end users. Even New.net, the alt.root blessed with millions in venture capital, never came close to success, because there was just not much value in offering someone registration under a top level domain that only one tenth or less of the Internet could see. (To salvage its unviable business model, New.net was pushed into a number of questionable tactics.)

Denuo/Addrex is most emphatically not like that. They have a simple, valuable business proposition to make to buyers and sellers of legacy address blocks: entities with surplus addresses can profit from them; entities that need more can get them. Businesses that use their services do not cut themselves off from part of the Internet – on the contrary, they help to maintain and secure their connectivity. And they do it without subjecting themselves to the uncertainty and bureaucracy of an ARIN “needs assessment.” Moreover, although Denuo will no doubt benefit handsomely by being the first mover in this market, its success in no way depends on being the only firm that ever pursues this business model. There is no reason why there could not be as many address brokers as there are real estate brokers.

But Vixie says this will all end in tears. Shorn of hype and rhetoric, Vixie’s argument amounts to this: if people are allowed to trade addresses outside of ARIN’s control, then we cannot have an accurate registration database that tells us who holds which address block.

That argument is obviously wrong. It would be a simple matter for ARIN (and other RIRs) to set up procedures to recognize and record transactions conducted by external parties – if they were willing to surrender pointless “needs assessments.” We have thousands of stock brokers and dozens of exchanges independently trading shares, but somehow the world manages to keep track of who does and does not own a specific quantity of critically important stock certificates. So ARIN does not need to control, mediate and set policy for all IP address transfers in order to maintain an accurate database of who holds which address block. It simply needs to serve as a title agency where people come to record their transactions. It is in the self-interest of both sides of an IP address transaction to ensure that their transfer of rights was recorded and is published in a common, authoritative global Whois database. Think of your local county or village property recorder. They do not insist on being a real estate brokerage in their territory – much less the only brokerage. There are many parties who can independently engage in property transactions and then bring them to the property registry for verification and recording.
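
The “title agency” model described above – record and publish transfers, don’t broker or approve them – is simple enough to sketch in code. The following Python fragment is a hypothetical illustration: the class and method names are invented, and a real registry would of course add cryptographic authentication and audit trails. The key point it demonstrates is that the registry checks only that the seller is the current holder of record, the equivalent of verifying title, not approving the sale:

```python
# Hypothetical sketch of a "title agency" address registry: it records
# who holds which block and accepts transfers arranged by any outside
# broker, without performing a needs assessment. Names are invented.

class TitleRegistry:
    def __init__(self):
        self._holder = {}  # address block -> current holder of record

    def register(self, block: str, holder: str) -> None:
        """Record an initial allocation."""
        if block in self._holder:
            raise ValueError(f"{block} already registered")
        self._holder[block] = holder

    def record_transfer(self, block: str, seller: str, buyer: str) -> None:
        """Record a transfer brokered by any external party.

        The registry verifies only that the seller is the holder of
        record -- verifying title, not approving the transaction."""
        if self._holder.get(block) != seller:
            raise ValueError(f"{seller} is not the holder of record for {block}")
        self._holder[block] = buyer

    def whois(self, block: str) -> str:
        """Publish the authoritative holder of record."""
        return self._holder.get(block, "unregistered")
```

A sale brokered by Addrex, Depository, or anyone else could be recorded this way, keeping the common Whois-style database accurate and authoritative without the registry controlling or “needs-testing” the trade – exactly the division of labor a county property recorder provides for real estate.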

Vixie is not appealing to the “physics” of registration and database economics – indeed, Vixie’s belief that industrial organization is explained by “physics” rather than economics is a bit odd. What’s really happening here is that Vixie and others at ARIN realize that the only way to shield themselves against losing control over and revenue from the IPv4 address market is to leverage their monopoly over the IP address registration database. What they would like to do is discourage competing and independent address trading and registration firms by refusing to make changes in their Whois database to reflect independent transactions. In antitrust/economics terms, they are tying registration records, which do need to be integrated and uniform, to another potentially competitive service – address brokerage and post-allocation registration services – so that ARIN can continue to enjoy a monopoly in both. The penalty for trading addresses outside of its transfer regime is to be banished from or ignored by ARIN’s Whois database.

This is a risky strategy, and the community that ARIN claims to represent should question it. There are powerful economic incentives for legacy holders and companies that need more addresses to trade. Those trades are going to happen. There are also strong economic incentives for those holding legacy resources to avoid contracting with ARIN and other RIRs. Those companies could choose to sign up with Depository and other companies like it. If ARIN refuses to record them because it doesn’t approve of the company or is not in control of the transaction, ARIN will be the one responsible for fragmenting and corrupting the IP address registration database.

Vixie’s argument about Whois fragmentation is doubly wrong when one realizes that the Whois database ARIN runs is actually a mess when it comes to pre-1997 address blocks (slightly over 40% of the available space). There are hundreds of defunct firms listed there, and thousands of registration records that have not changed in fifteen years. I reached out to Charles Lee of Addrex to get his comments on Vixie’s article. He first issued a disclaimer that “generally, we…do not engage in public debates about Internet governance. We prefer to focus on our clients and serving their needs.” But he felt that Vixie’s posting contained “misleading inaccuracies” that required some comment.

“The present Whois system which Mr. Vixie defends with such verve absolutely cannot be relied upon. That is not a personal position or opinion. It is a fact. As an illustrative example let’s use the now famous Nortel/Microsoft deal which was brokered by Addrex Inc. In that transaction not one of the 38 number blocks transferred from Nortel Networks Inc. to Microsoft Inc. were appropriately listed in any Whois database in the name of Nortel Networks Inc. It took hundreds of man-hours and over two thousand pages of documentation to reconcile the fiction of the Whois system entries to the reality of Nortel Networks. Unfortunately that now famous case is not the exception to the rule but rather the reality of the situation. For Mr. Vixie to even suggest that Network operators could, or should, rely on an identification system rife with such gross errors is unconscionable.”
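Claims about stale registration records are easy to spot-check, because the Whois protocol itself (RFC 3912) is nothing more than a one-shot TCP exchange on port 43. Below is a minimal sketch in Python: it sends a query to ARIN’s server and picks out the `Key: Value` fields of the reply. The record handle in the comment is illustrative only, and the exact field names in a live reply may differ.

```python
import socket

def whois_query(query, server="whois.arin.net", port=43):
    """RFC 3912 Whois: open a TCP connection, send the query, read until EOF."""
    with socket.create_connection((server, port), timeout=10) as sock:
        sock.sendall((query + "\r\n").encode("ascii"))
        chunks = []
        while data := sock.recv(4096):
            chunks.append(data)
    return b"".join(chunks).decode("utf-8", errors="replace")

def parse_record(text):
    """Collect 'Key: Value' fields, skipping '#' comment lines; keep first value."""
    fields = {}
    for line in text.splitlines():
        if line.startswith("#") or ":" not in line:
            continue
        key, _, value = line.partition(":")
        fields.setdefault(key.strip(), value.strip())
    return fields

# e.g. rec = parse_record(whois_query("NET-192-0-2-0-1"))  # hypothetical handle
# A legacy registration untouched since the 1990s would show a stale update date.
```

Scripting such lookups across a sample of pre-1997 blocks is exactly how one would measure, rather than merely assert, how much of the legacy space points at defunct holders.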

It is likely that private, commercially motivated firms will do a far better job of cleaning up the Whois records than ARIN will on its own. Denuo’s Depository, for example, is already advertising its ability to research and verify the “chain of custody” of address holdings. Depository has advocated a “registry-registrar split,” analogous to the one in the domain name industry, that would allow competing firms to provide post-allocation registry services while maintaining the RIRs as a single, accurate registration database. Vixie’s assumption that no “copycat” competition to Denuo will ever emerge is simply wrong: the structural reforms Denuo has advocated are designed precisely to pave the way for more competition in that field, hence its filing with ICANN to create an accreditation policy for all IP address registrars.
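The structure of the proposed registry-registrar split can be sketched in a few lines: one authoritative registry holds the uniform records and enforces the validity checks, while any number of accredited registrars compete to broker and submit transfers. All names here (the parties and registrar labels) are hypothetical, and this is an illustration of the concept, not of any actual RIR system.

```python
class AddressRegistry:
    """Single authoritative registry: one uniform record per address block."""
    def __init__(self):
        self.records = {}          # block -> current registered holder
        self.accredited = set()    # registrars allowed to submit updates

    def accredit(self, registrar):
        self.accredited.add(registrar)

    def record_transfer(self, registrar, block, seller, buyer):
        # Validity checks live in one place; which broker closed the deal
        # does not matter, so the database never fragments.
        if registrar not in self.accredited:
            raise PermissionError(f"{registrar} is not accredited")
        if self.records.get(block) != seller:
            raise ValueError(f"{seller} is not the registered holder of {block}")
        self.records[block] = buyer

registry = AddressRegistry()
registry.records["192.0.2.0/24"] = "LegacyHolderExample"   # hypothetical legacy block
registry.accredit("DepositoryExample")
registry.accredit("SomeOtherRegistrar")

# Competing registrars can each record transfers; the record stays unified.
registry.record_transfer("DepositoryExample", "192.0.2.0/24",
                         "LegacyHolderExample", "BuyerExample")
print(registry.records["192.0.2.0/24"])   # prints BuyerExample
```

The design point is that accreditation plus centralized validation preserves a single source of truth while opening the brokerage function to competition, which is the opposite of the fragmentation Vixie predicts.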

To conclude: Internet elders such as Vixie have done a wonderful job writing software and defending the Internet’s standards and architecture against various threats. For this we can be respectful and grateful. But expertise in technology does not necessarily translate into expertise in economics and public policy. Vixie’s polemic in ACM Queue shows how expertise in one domain can become blind arrogance in another. Only someone determined to preserve ARIN’s monopoly would make the claims he makes.

]]>
https://techliberation.com/2011/08/15/trading-ipv4-addresses-starts-making-internet-elders-nervous/feed/ 11 38068
DC event on Internet governance https://techliberation.com/2011/04/20/dc-event-on-internet-governance/ https://techliberation.com/2011/04/20/dc-event-on-internet-governance/#respond Wed, 20 Apr 2011 21:35:02 +0000 http://techliberation.com/?p=36337

“Global Internet Governance: Research and Public Policy Challenges for the Next Decade” is the title for a conference event held May 5 and 6 at the American University School of International Service in Washington. See the full program here.

Featured will be a keynote by NTIA head Lawrence Strickling, Assistant Secretary of Commerce for Communications and Information. TLF-ers may be especially interested in the panel on the market for IPv4 addresses that is emerging as the Regional Internet Registries and ICANN have depleted their free pool of IP addresses. The panel, “Scarcity in IPv4 addresses,” will feature representatives of the American Registry for Internet Numbers (ARIN) and of Addrex/Depository, Inc., the new company that brokered the deal between Nortel and Microsoft. There will also be debates about Wikileaks and the future of the Internet Governance Forum. Academic research papers on ICANN’s Affirmation of Commitments, the role of national governments in ICANN, the role of social media in the Middle East/North Africa revolutions, and other topics will be presented on the second day. The event was put together by the Global Internet Governance Academic Network (GigaNet). Attendance is free of charge, but you are asked to register in advance.

]]>
https://techliberation.com/2011/04/20/dc-event-on-internet-governance/feed/ 0 36337
Why I fear the AT&T-T-Mobile merger https://techliberation.com/2011/04/18/why-i-fear-the-att-t-mobile-merger/ https://techliberation.com/2011/04/18/why-i-fear-the-att-t-mobile-merger/#comments Mon, 18 Apr 2011 21:14:01 +0000 http://techliberation.com/?p=36335

I was surprised to read a defense of the AT&T-T-Mobile merger here.

Let’s begin at the beginning and ask why this merger is happening. It’s not as if AT&T is gaining dominance the way Google gained it in search and advertising, or the way Intel did in chips: i.e., through low prices, superior products and customer loyalty. No, last time I looked AT&T was the carrier with the lowest customer satisfaction ratings, some of the highest prices and some of the weakest network performance metrics. In my opinion there is no reason for this merger to take place other than to make life easier for AT&T by reducing the competitive pressures on it. AT&T seems to be driven by the following calculus: it can either grow its services and its network under the harsh constraints of market pricing and competition, or it can attempt to reduce the field to an oligopoly with tacit price coordination by using its size and financial bulk to eliminate a pest that keeps downward pressure on prices and service requirements. I think it is rational for AT&T to try to get away with the latter. I think it is insane for free-market-oriented thinkers to support it.

Larry Downes cannot argue with the extremely high level of market concentration, and the scary HHI measurements, that the merger would produce. So he plays the game that clever antitrust advocates always play: shift the market definition. Downes argues that “both Justice and the FCC have consistently concluded that wireless markets are essentially local.” I see no citation to any specific document behind Downes’s claim, but if Justice and the FCC have concluded that “local” means “my metropolitan area,” they are wrong.
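For readers unfamiliar with the metric: the Herfindahl-Hirschman Index (HHI) is simply the sum of the squared market shares of every firm in the market, so a merger raises it by combining two shares into one larger square. The shares below are hypothetical round numbers for illustration, not the actual 2011 national figures; the thresholds cited are those in the DOJ/FTC 2010 Horizontal Merger Guidelines.

```python
def hhi(shares_percent):
    """Herfindahl-Hirschman Index: sum of squared market shares (in percent)."""
    return sum(s * s for s in shares_percent)

# Hypothetical national shares (percent) for illustration only.
pre_merger  = [32, 31, 17, 11, 9]   # e.g. carrier A, B, C, D, all others
post_merger = [32, 31 + 11, 17, 9]  # carrier B absorbs carrier D

print(hhi(pre_merger))                     # 2476 -> near the 2500 line for a
                                           # "highly concentrated" market
print(hhi(post_merger))                    # 3158 -> well past it
print(hhi(post_merger) - hhi(pre_merger))  # 682  -> far above the ~200-point
                                           # increase that draws scrutiny
```

Even with generous assumptions, any national-market arithmetic of this shape lands deep in the zone the guidelines flag, which is precisely why the market-definition game matters so much to the merger’s defenders.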

Let’s reacquaint everyone with a very basic but pertinent fact: 93% of wireless users in the U.S. are served by the national carriers. This proportion (national as opposed to regional providers) has generally increased over the past decade, driven by demand-side requirements, mergers, and supply-side efficiencies. Consumers’ choices have rendered a decisive verdict against Downes’s claim. Whether it’s voice or data, people expect and want seamless national service; a small but significant segment wants transnational compatibility as well.

Increases in the scope of service will intensify as we move from a primarily voice-driven market to a data-driven one. Carriers that have to impose roaming charges and interconnection fees on their users will not be competitive, nor will they be able to attract the interest of cutting-edge handset manufacturers and service developers. Can you imagine Apple signing an iPhone exclusivity deal with Cricket?

It is no accident that the dominant mobile network operators have national brands and national footprints. Most Americans travel outside their metro areas at least once a month, and go somewhere farther away than that at least once or twice a year. The 93% who choose a national carrier are rationally calculating that it pays not to have to guess the service-area limits of their provider. Of course, a highly budget-constrained segment of the market will accept limited local service for a lower price. But to say that those smaller providers are in the same market as a T-Mobile or an AT&T is not plausible; they occupy a niche. And if one allows a major merger like this on the grounds that these tiny players constitute a competitive alternative to the likes of AT&T, what will one say when the last of the local providers is gobbled up?

How about the “spectrum efficiency” argument? Downes, like today’s AT&T, makes the same claim the old AT&T made when it said there should be no microwave-based competition in long distance. As a matter of pure engineering efficiency, it is of course true that a single, optimizing planner can make better use of limited spectrum bands than multiple, competing providers. But that argument applies to any and all carriers (an AT&T-Verizon merger, for example) and to any resource – which is why the socialists of the 19th century used it to claim that capitalism was inherently wasteful and inefficient. The dynamic efficiencies of competition typically benefit the public far more than a few allocative efficiencies. And there are plenty of ways for AT&T to expand network capacity without merging.

But there is an interesting twist to this line of reasoning. Notice how the “market is local” claim suddenly disappears. AT&T needs to take over a smaller national rival, according to Downes, so it can “accelerate deployment of nationwide mobile broadband using LTE technology, including expansion into rural areas.” Voila! Once we start talking about spectrum efficiencies and the promotion of universal service we take a nationwide perspective, not a local one. Doesn’t this obvious contradiction make anyone suspicious?

Notice also the ominous historical overtones of AT&T’s claim that it will be able to promote universal broadband service in rural areas if it has a stronger monopoly… er, if it gains consolidation efficiencies. Rural areas don’t have congested spectrum, do they? So what is stopping AT&T from serving them now? If it needs help to do so, where will the subsidies come from? Would more market power make that possible? One cannot help but ask: is AT&T doing this to get more spectrum, or is it trying to pull a neo-Theodore Vail and promise the government that it will subsidize rural access if it is allowed more market power?

Bottom line: this is a step too far back toward the days of a single telephone company. If you support a competitive industry, in which the public and legislators can reasonably rely on market forces as the primary industry regulator, this merger has to be stopped. If, on the other hand, you welcome the growing pressure to regulate carriers and make them the policemen and chokepoints of network control, a bigger AT&T is just what the doctor ordered.

]]>
https://techliberation.com/2011/04/18/why-i-fear-the-att-t-mobile-merger/feed/ 7 36335
GAC backs off TLD censorship a bit – but not enough https://techliberation.com/2011/02/24/gac-backs-off-tld-censorship-a-bit-but-not-enough/ https://techliberation.com/2011/02/24/gac-backs-off-tld-censorship-a-bit-but-not-enough/#comments Thu, 24 Feb 2011 17:43:30 +0000 http://techliberation.com/?p=35165

ICANN has posted an official “GAC Indicative Scorecard” in advance of the Feb. 28 showdown in Brussels between the Governmental Advisory Committee (GAC) and the ICANN Board. The “scorecard” is intended to identify the areas where the small number of governmental officials who participate in the GAC differ from the positions developed by ICANN’s open policy development process. It constitutes a not-so-subtle threat that ICANN should throw out its staff- and community-developed policies and conform them to the GAC’s preferences. Amusingly, the so-called GAC position follows almost verbatim the text submitted as the “US position” back in January. It is clear that the US calls the shots in the GAC and that other governments, including the EU, are cast in the role of making minor modifications to U.S. initiatives.

There is one interesting modification, however. The new GAC scorecard still allows the GAC to conduct an initial review of all new top-level domain applications, and still allows any GAC member to object to any string “for any reason.” But the GAC has been publicly shamed into pulling back from the U.S. government’s recommendation that a single GAC objection, if not overruled by other governments, would kill an application. Instead, the GAC as a whole will “consider” any objection and develop written “advice” to be forwarded to the Board. This would put such advice within the framework of ICANN’s bylaws, under which it would not be binding on the Board.

While it is heartening that public pressure has forced the governments to pull back from their more outrageous demands, the resulting procedure is still arbitrary and an unacceptable incursion on free expression and free markets. For a more complete analysis, see the IGP blog.

]]>
https://techliberation.com/2011/02/24/gac-backs-off-tld-censorship-a-bit-but-not-enough/feed/ 1 35165
Time to mobilize against a governmental takeover of DNS https://techliberation.com/2011/02/21/time-to-mobilize-against-a-governmental-takeover-of-dns/ https://techliberation.com/2011/02/21/time-to-mobilize-against-a-governmental-takeover-of-dns/#comments Mon, 21 Feb 2011 06:13:14 +0000 http://techliberation.com/?p=35100

For better or worse, my first post here is going to be a rather urgent call to action. I’d like to encourage everyone who reads this blog to register their support for this petition. Entitled, “Say no to the GAC veto,” it expresses opposition to a shocking and dangerous turn in U.S. policy toward the global domain name system. It is a change that would reverse more than a decade of commitment to a transnational, bottom-up, civil society-led approach to governance of Internet identifiers, in favor of a top-down policy making regime dominated by national governments.

If the U.S. Commerce Department has its way, not only would national governments call the shots regarding which new domains could exist and what ideas and words they could and could not use, but they would be empowered to do so without any constraint or guidance from law, treaties or constitutions. Our own Commerce Department wants to let any government in the world censor a top-level domain proposal “for any reason.” A government or two could object simply because it dislikes the person behind a proposal, the ideas it espouses or the name itself, and those objections could be fatal. This kind of direct state control over content-related matters sets an ominous precedent for the future of Internet governance.

On February 28 and March 1, ICANN and its Governmental Advisory Committee will meet in Brussels to negotiate over ICANN’s program to add new top-level domain names to the root. The U.S. Commerce Department has chosen to make this meeting a showdown, in which the so-called Governmental Advisory Committee (GAC) will demand that the organization rewrite and redo policies and procedures that ICANN and its stakeholder groups have labored for the past six years to agree on. The GAC veto, assailed by our petition, is only the most objectionable item on a long list of bad ideas our Commerce Department is dragging into the consultation. We need to make a strong showing to ensure that ICANN has the backbone to resist these pressures.

For those concerned about the role of the state in communications and information, I can’t think of a better, clearer flashpoint for focusing your efforts. A great deal of the Internet’s innovation and revolutionary character came from the fact that it escaped control of national states and began to evolve new, transnational forms of governance. As governments wake up to the power and potential of the Internet, they have increasingly sought to assert traditional forms of control.

The relationship between national governments and ICANN, which came into being during the Clinton administration as an attempt to “privatize” and globalize the policy making and coordination of the Internet’s domain name system, has always been a fraught one. Whatever its flaws (and they are many), ICANN at least gives us a globalized governance regime that is rooted in the Internet’s technical community and users, and one step removed from the miasma of national governments and intergovernmental organizations. The GAC was initially just an afterthought tacked onto ICANN’s structure to appease the European Union. It was – and is still supposed to be – purely advisory in function. Initially it was conceived as simply providing ICANN with information about the way its policies interacted with national policies.

Those of you with long memories may be feeling a sense of deja vu. Didn’t we think we settled the issue of an intergovernmental takeover of ICANN back in 2005, during the World Summit on the Information Society? Wasn’t it the U.S. government that went into that summit playing to fears of a “UN takeover of the Internet” and swearing that it was protecting the Internet from “burdensome intergovernmental oversight and control”? Wouldn’t most Americans be surprised to learn that the Commerce Department is now using ICANN’s Governmental Advisory Committee to reassert intergovernmental control over what kinds of new web sites can be created? Ironically, the US has become the world’s most formidable advocate of burdensome government oversight and control in Internet governance. And it has done so without any public consultation or legal authority.

Please spread the word about this petition and use whatever channels you have to isolate the Commerce Department’s illegitimate incursions on constitutional free expression guarantees.

]]>
https://techliberation.com/2011/02/21/time-to-mobilize-against-a-governmental-takeover-of-dns/feed/ 10 35100