Internet Governance & ICANN – Technology Liberation Front
https://techliberation.com
Keeping politicians’ hands off the Net & everything else related to technology
Last updated: Thu, 20 Jan 2022

New Jurimetrics Article: “Soft Law in U.S. ICT Sectors: Four Case Studies”
https://techliberation.com/2021/02/01/new-jurimetrics-article-soft-law-in-u-s-ict-sectors-four-case-studies/
Mon, 01 Feb 2021

After a slight delay, Jurimetrics has finally published my latest law review article, “Soft Law in U.S. ICT Sectors: Four Case Studies.” It is part of a major symposium that Arizona State University (ASU) Law School put together for the journal on “Governing Emerging Technologies Through Soft Law: Lessons For Artificial Intelligence.” I was one of four scholars invited to pen foundational essays for this symposium. Jurimetrics is an official publication of the American Bar Association’s Section of Science & Technology Law.

This report was a major undertaking that involved dozens of interviews, extensive historical research, several events and presentations, and then numerous revisions before the final product was released. The final PDF version of the journal article is attached.

Here is the abstract:

Traditional hard law tools and processes are struggling to keep up with the rapid pace of innovation in many emerging technologies sectors. As a result, policy­makers in the United States rely increasingly on less formal “soft law” governance mech­anisms to address concerns surrounding many newer technologies. This Article explores four case studies from different information technology areas where soft law mechanisms have already been utilized to address governance concerns. These four sectoral case stud­ies include domain name management, content oversight, privacy policy, and cyberse­curity matters. After considering the various soft law mechanisms used to address those issues, the Article concludes with some general thoughts about the effectiveness of those approaches and what lessons those case studies might hold for the use of soft law in other emerging technology sectors and contexts.

3 takeaways from the Plenipot
https://techliberation.com/2014/11/13/3-takeaways-from-the-plenipot/
Thu, 13 Nov 2014

Last week marked the conclusion of the ITU’s Plenipotentiary Conference, the quadrennial gathering at which ITU member states revise the treaty that establishes the Union and conduct other high-level business. I had the privilege of serving as a member of the US delegation, as I did for the WCIT, and of seeing the negotiations firsthand. This year’s Plenipot was far less contentious than the WCIT was two years ago. For other summaries of the conference, let me recommend Samantha Dickinson, Danielle Kehl, and Amb. Danny Sepulveda. Rather than recap their posts or the entire conference, I just want to add a few observations of my own.

We mostly won on transparent access to documents

Through my involvement with WCITLeaks, I have closely followed the issue of access to ITU documents, both before and during the Plenipot. My assessment is that we mostly won.

Going forward, most inputs to and outputs of ITU conferences and assemblies will be available to the public on the ITU website. This still excludes (a) working documents, (b) documents related to other meetings, such as those of Council Working Groups and Study Groups, and (c) non-meeting documents that should also be available to the public.

However, in February, an ITU Council Working Group will be meeting to develop what is likely to be a more extensive document access policy. In May, the whole Council will meet to provisionally approve an access policy. And in 2018, the next Plenipot will permanently decide what to do about this provisional access policy.

There are no guarantees, and we will need to closely monitor the outcomes in February and May to see what policy is adopted—but if it is a good one, I would be prepared to shut down WCITLeaks as it would become redundant. If the policy is inadequate, however, WCITLeaks will continue to operate until the policy improves.

I was gratified that WCITLeaks continued to play a constructive role in the discussion. For example, in the Arab States’ proposal on ITU document access, they cited us, considering “that there are some websites on the Internet which are publishing illegally to the public ITU documents that are restricted only to Member States.” In addition, I am told that at the CEPT coordination meeting, WCITLeaks was thanked for giving the issue of transparency at the ITU a shot in the arm.

A number of governments were strong proponents of transparency at the ITU, but I think special thanks are due to Sweden, who championed the issue on behalf of Europe. I was very grateful for their leadership.

The collapse of the WCIT was an input into a harmonious Plenipot

We got through the Plenipot without a single vote (other than officer elections)! That’s great news—it’s always better when the ITU can come to agreement without forcing some member states to go along.

I think it’s important to recognize the considerable extent to which this consensus agreement was driven by events at the WCIT in 2012. At the WCIT, when the US (and others) objected and said that we could not agree to certain provisions, other countries thought we were bluffing. They decided to call our bluff by engineering a vote, and we wisely decided not to sign the treaty, along with 54 other countries.

In Busan this month, when we said that we could not agree to certain outcomes, nobody thought we were bluffing. Our willingness to walk away at the WCIT gave us added credibility in negotiations at the Plenipot. While I also believe that good diplomacy helped secure a good outcome at the Plenipot, the occasional willingness to walk the ITU off a cliff comes in handy. We should keep this in mind for future negotiations—making credible promises and sticking to them pays dividends down the road.

The big question of the conference is in what form the India proposal will re-emerge

At the Plenipot, India offered a sweeping proposal to fundamentally change the routing architecture of the Internet so that a) IP addresses would be allocated by country, like telephone numbers, with a country prefix and b) domestic Internet traffic would never be routed out of the country.

This proposal was obviously very impractical. It is unlikely, in any case, that the ITU has the expertise or the budget to undertake such a vast reengineering of the Internet. But the idea would also be very damaging from the perspective of individual liberty—it would make nation-states, even more than they are now, the mediators of human communication.
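To make the contrast concrete, here is a toy sketch of how country-prefixed addressing would work, by analogy to telephone country codes. The prefixes and addresses below are invented for illustration and are not taken from the Indian proposal itself:

```python
# Toy model of country-prefixed addressing (hypothetical prefixes).
# Under such a scheme, the address itself reveals -- and fixes -- the
# user's country, the way a phone number's country code does.

def country_of(address: str, prefix_table: dict) -> str:
    """Return the country a country-prefixed address belongs to."""
    prefix = address.split(".")[0]
    return prefix_table.get(prefix, "unknown")

# Invented prefixes, analogous to E.164 telephone country codes.
PREFIX_TABLE = {"91": "India", "1": "United States", "44": "United Kingdom"}

print(country_of("91.203.0.1", PREFIX_TABLE))  # India
print(country_of("44.10.7.2", PREFIX_TABLE))   # United Kingdom

# Today, by contrast, address blocks are delegated to regional
# registries and then to network operators, so an address tells you
# who routes for you, not where you are -- and traffic follows peering
# economics rather than national borders.
```

The point of the sketch is that once addresses encode nationality, every packet carries a state affiliation, which is precisely what makes governments the gatekeepers of communication.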

I was very proud that the United States not only made the practical case against the Indian proposal, it made a principled one. Amb. Sepulveda made a very strong statement indicating that the United States does not share India’s goals as expressed in this proposal, and that we would not be a part of it. This statement, along with those of other countries and subsequent negotiations, effectively killed the Indian proposal at the Plenipot.

The big question is in what form this proposal will re-emerge. The idea of remaking the Internet along national lines is unlikely to go away, and we will need to continue monitoring ITU study groups to ensure that this extremely damaging proposal does not raise its head.

Trust (but verify) the engineers – comments on Transatlantic digital trade
https://techliberation.com/2014/09/28/trust-but-verify-the-engineers-comments-on-transatlantic-digital-trade/
Sun, 28 Sep 2014

Last week, I participated in a program co-sponsored by the Progressive Policy Institute, the Lisbon Council, and the Georgetown Center for Business and Public Policy on “Growing the Transatlantic Digital Economy.”

The complete program, including keynote remarks from EU VP Neelie Kroes and U.S. Under Secretary of State Catherine A. Novelli, is available below.

My remarks reviewed worrying signs of old-style interventionist trade practices creeping into the digital economy in new guises, and urged traditional governments to stay the course (or correct it) on leaving the Internet ecosystem largely to its own organic forms of regulation and market correctives:

Vice President Kroes’s comments underscore an important reality about innovation and regulation. Innovation, thanks to exponential technological trends including Moore’s Law and Metcalfe’s Law, gets faster and more disruptive all the time, a phenomenon my co-author and I have dubbed “Big Bang Disruption.” Regulation, on the other hand, happens at the same pace (at best). Even the most well-intentioned regulators, and I certainly include Vice President Kroes in that list, find in retrospect that interventions aimed at heading off possible competitive problems and potential consumer harms rarely achieve their objectives and, indeed, generate more harmful unintended consequences. This is not a failure of government. The clock speeds of innovation and regulation are simply different, and diverging faster all the time.

The Internet economy has been governed from its inception by the engineering-driven multistakeholder process embodied in the task forces and standards groups that operate under the umbrella of the Internet Society. Innovation, for better or for worse, is regulated more by Moore’s Law than traditional law. I happen to think the answer is “for better,” but I am not one of those who take that to the extreme in arguing that there is no place for traditional governments in the digital economy. Governments have played and continue to play an essential part in laying the legal foundations for the remarkable growth of that economy and in providing incentives, if not funding, for basic research that might not otherwise find investors. And when genuine market failures appear, traditional regulators can and should step in to correct them as efficiently and narrowly as they can. Sometimes this has happened. Sometimes it has not.

Where in particular I think regulatory intervention is least effective and most dangerous is in regulating ahead of problems—in enacting what the FCC calls “prophylactic rules.” The effort to create legally sound Open Internet regulations in the U.S. has faltered repeatedly, yet in the interim investment in both infrastructure and applications continues at a rapid pace—far outstripping the rest of the world. The results speak for themselves. U.S. companies dominate the digital economy, and, as Prof. Christopher Yoo has definitively demonstrated, U.S. consumers overall enjoy the best wired and mobile infrastructure in the world at competitive prices.

At the same time, those who continue to pursue interventionist regulation in this area often have hidden agendas. Let me give three examples:

1. As we saw earlier this month at the Internet Governance Forum, which I attended along with Vice President Kroes and 2,500 other delegates, representatives of the developing world were told by so-called consumer advocates from the U.S. and the EU that they must reject so-called “zero rated” services, in which mobile network operators partner with service providers including Facebook, Twitter, and Wikimedia to offer their popular services to new Internet users without that usage counting against data costs. Zero rating is an extremely popular tool for helping the two-thirds of the world’s population not currently on the Internet get connected and, likely, to move from these services to many others. But such services violate the “principle” of neutrality that has mutated from an engineering concept into a nearly religious conviction. And so zero rating must be sacrificed, along with users who are too poor to otherwise join the digital economy.

2. Closer to home, we see the wildly successful Netflix service making a play to hijack the Open Internet debate into one about back-end interconnection, peering, and transit—engineering features that work so well that 99 percent of the interconnection agreements between networks, according to the OECD, aren’t even written down.

3. And in Europe, there are other efforts to turn the neutrality principle on its head, using it as a hammer not to regulate ISPs but to slow the progress of leading content and service providers, including Apple, Amazon, and Google, who have what the French Digital Council and others refer to as non-neutral “platform monopolies” that must be broken up.

To me, these are in fact new faces on very old strategies: colonialism, rent-seeking, and protectionist trade warfare, respectively. My hope is that Internet users—an increasingly powerful and independent source of regulatory discipline in the Internet economy—will see these efforts for what they truly are…and reject them resoundingly. The more we trust (but also verify) the engineers, the faster the Internet economy will grow, both in the U.S. and Europe, and the more our trade in digital goods and services will strengthen the ties between our traditional economies. It’s worked brilliantly for almost two decades. The alternatives, not so much.
WCITLeaks is Ready for Plenipot
https://techliberation.com/2014/09/26/wcitleaks-is-ready-for-plenipot/
Fri, 26 Sep 2014

The ITU is holding its quadrennial Plenipotentiary Conference in Busan, South Korea from October 20 to November 7, 2014. The Plenipot, as it is called, is the ITU’s “supreme organ” (a funny term that I did not make up). It represents the highest level of decision making at the ITU. As it has for the last several ITU conferences, WCITLeaks will host leaked documents related to the Plenipot.

For those interested in transparency at the ITU, two interesting developments are worth reporting. On the first day of the conference, the heads of delegation will meet to decide whether documents related to the conference should be available to the public directly through the TIES system without a password. All of the documents associated with the Plenipot are already available in English on WCITLeaks, but direct public access would have the virtue of including those in the world who do not speak English but do speak one of the other official UN languages. Considering this additional benefit of inclusion, I hope that the heads of delegation will seriously consider the advantages of adopting a more open model for document access during this Plenipot. If you would like to contact the head of delegation for your country, you can find their names in this document. A polite email asking them to support open access to ITU documents might not hurt.

In addition, at the meeting, the ITU membership will consider a proposal from the United States to, as a rule, provide open access to all meeting documents.

[Embedded image: the U.S. proposal for open access to ITU meeting documents]

This is what WCITLeaks has always supported—putting ourselves out of business. As the US proposal notes, the ITU Secretariat has conducted a study finding that other UN agencies are much more forthcoming in terms of public access to their documents. A more transparent ITU is in everyone’s interest—including the ITU’s. This Plenipot has the potential to remedy a serious deficiency with the institution; I’m cheering for them and hoping they get it right.

Net Neutrality and the Dangers of Title II
https://techliberation.com/2014/09/26/net-neutrality-and-the-dangers-of-title-ii/
Fri, 26 Sep 2014

There are several “flavors” of net neutrality–Eli Noam at Columbia University estimates there are seven distinct meanings of the term–but most net neutrality proponents agree that reinterpreting the 1934 Communications Act and “classifying” Internet service providers as Title II “telecommunications” companies is the best way forward. Proponents argue that ISPs are common carriers and therefore should be regulated much like common carrier telephone companies. Last week I filed a public interest comment about net neutrality and pointed out why the Title II option is unwise and possibly illegal.

For one, courts have defined “common carriers” in such a way that ISPs don’t look much like common carriers. It’s also unlikely that ISPs can be classified as telecommunications providers because Congress defines “telecommunications” as the transmission of information “between or among points specified by the user.” Phone calls are telecommunications because callers are selecting the endpoint–a person associated with the known phone number. Even simple web browsing, however, requires substantial processing by an ISP that often coordinates several networks, servers, and routers to bring the user the correct information, say, a Wikipedia article or Netflix video. Under normal circumstances, this process is completely mysterious to a user. By classifying ISPs as common carriers and telecommunications providers, therefore, the FCC invites immense legal risk.
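The distinction above can be made concrete with a toy sketch (invented names and addresses, not a real resolver): when browsing, the user supplies only a hostname, and the resolution machinery, not the user, selects the actual network endpoint, often from several interchangeable candidates. A phone caller, by contrast, specifies the endpoint directly via the number.

```python
# Toy DNS table: one name maps to several interchangeable server
# addresses (addresses are from the documentation range 198.51.100/24).
TOY_DNS = {
    "en.wikipedia.org": ["198.51.100.7", "198.51.100.8", "198.51.100.9"],
}

def resolve(hostname: str, preference: int = 0) -> str:
    """The resolver, not the user, decides which address is returned.
    `preference` stands in for whatever policy (load, geography, cache
    state) the real machinery would apply."""
    candidates = TOY_DNS[hostname]
    return candidates[preference % len(candidates)]

# The user only ever supplied the name; the endpoint varies underneath.
print(resolve("en.wikipedia.org"))     # 198.51.100.7
print(resolve("en.wikipedia.org", 2))  # 198.51.100.9
```

Which concrete address serves the request is invisible to the user, which is why it is hard to say the points were “specified by the user” in the statutory sense.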

As I’ve noted before, prioritized data can provide consumer benefits, and stringent net neutrality rules would harm the development of new services on the horizon. Title II—in making the Internet more “neutral”—is anti-progress, akin to trying to put the toothpaste back in the tube. The Internet has never been neutral, as computer scientist David Clark and others point out, and it’s getting less neutral all the time. VoIP phone service is already prioritized for millions of households. VoLTE will do the same for wireless phone customers.

It’s a largely unreported story that many of the most informed net neutrality proponents, including President Obama’s former chief technology officer, are fine with so-called “fast lanes”–particularly if it’s the user, not the ISP, selecting the services to be prioritized. There is general agreement that prioritized services are demanded by consumers, but Title II would have a predictable chilling effect on new services because of the regulatory burdens.

MetroPCS, for example, a small wireless carrier with about 3 percent market share, attempted to sell a purportedly non-neutral phone plan that allowed unlimited YouTube viewing and was pilloried for it by net neutrality proponents. MetroPCS, chastened, dropped the plan. With Title II, a small ISP or wireless carrier wouldn’t dream of attempting such a thing.

In the comment, I note other undesirable effects of Title II, including that it undermines the position the US has held publicly for years that the Internet is different from traditional communications.

If the FCC further intermingles traditional telecommunications with broadband, it may increase the probability of the [International Telecommunications Union] extending sender-pays or other tariffing and tax rules to the exchange of Internet traffic. Several countries proposed instituting sender-pays at a contentious 2012 ITU forum and the United States representatives vigorously fought sender-pays for the Internet. Many developing countries, particularly, would welcome such a change in regulations, because, as Mercatus scholar Eli Dourado found, sender-pays rules “allow governments to export some of their statutory tax burden.” New foreign tariffing rules would function essentially as a transfer of wealth from popular US-based companies like Facebook and Google to corrupt foreign governments and telephone cartels.
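The sender-pays mechanics the quoted passage describes can be illustrated with a toy settlement calculation. All figures below are invented for illustration; real accounting rates are negotiated bilaterally:

```python
# Back-of-the-envelope sketch of a sender-pays accounting-rate regime:
# the net *sender* of traffic pays the other network for the imbalance.

def settlement(sent_gb: float, received_gb: float, rate_per_gb: float) -> float:
    """Net payment owed by this network under sender-pays.
    Positive means this network pays the other side."""
    return (sent_gb - received_gb) * rate_per_gb

# A content-heavy US network sends far more traffic to a foreign
# incumbent than it receives back, so it owes the difference.
owed = settlement(sent_gb=1000.0, received_gb=100.0, rate_per_gb=0.05)
print(f"US side owes ${owed:.2f}")  # US side owes $45.00
```

Because popular US services are overwhelmingly net senders of traffic, extending this model to the Internet would function as the one-way wealth transfer described above, regardless of the specific rate chosen.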

Finally, I note that classifying ISPs as common carriers weakens the enforcement of antitrust and consumer protection laws. Generally, it is difficult to bring antitrust lawsuits in extensively regulated industries. After filing my comment, I learned that the FTC also filed a comment noting, similarly, that its Section 5 authority would be limited if the FCC goes the Title II route. Brian Fung and others have since written about this interesting political and legal development. This detrimental effect on antitrust enforcement should weigh against Title II regulation.

There are substantial drawbacks to Title II regulation of ISPs and the FCC should exercise regulatory humility and its traditional hands-off approach to the Internet. In the end, Title II would harm investment in nascent technologies and network upgrades. The harms to consumers and small carriers, particularly, would be immense. It almost makes one think that comedy sketches and “death of the Internet” reporting don’t lead to good public policy.

More Information

See my presentation (36 minutes) on net neutrality and “fast lanes” on the Mercatus website.

Why Reclassification Would Make the Internet Less Open
https://techliberation.com/2014/05/15/why-reclassification-would-make-the-internet-less-open/
Thu, 15 May 2014

There seems to be increasing chatter among net neutrality activists lately on the subject of reclassifying ISPs as Title II services, subject to common carriage regulation. Although the intent in pushing reclassification is to make the Internet more open and free, in reality such a move could backfire badly. Activists don’t seem to have considered the effect of reclassification on international Internet politics, where it would likely give enemies of Internet openness everything they have always wanted.

At the WCIT in 2012, one of the major issues up for debate was whether the revised International Telecommunication Regulations (ITRs) would apply to Operating Agencies (OAs) or to Recognized Operating Agencies (ROAs). OA is a very broad term that covers private network operators, leased line networks, and even ham radio operators. Since “OA” would have included IP service providers, the US and other more liberal countries were very much opposed to the application of the ITRs to OAs. ROAs, on the other hand, are OAs that operate “public correspondence or broadcasting service.” That first term, “public correspondence,” is a term of art that means basically common carriage. The US government was OK with the use of ROA in the treaty because it would have essentially cabined the regulations to international telephone service, leaving the Internet free from UN interference. What actually happened was that there was a failed compromise in which ITU Member States created a new term, Authorized Operating Agency, that was arguably somewhere in the middle—the definition included the word “public” but not “public correspondence”—and the US and other countries refused to sign the treaty out of concern that it was still too broad.

If the US reclassified ISPs as Title II services, that would arguably make them ROAs for ITU purposes (arguably, because it depends on how you read the definition of ROA and Article 6 of the ITU Constitution). This potentially opens ISPs up to regulation under the ITRs. This might not be so bad if the US were the only country in the world—after all, the US did not sign the 2012 ITRs, and it does not use the ITU’s accounting rate provisions to govern international telecom payments.

But what happens when other countries start copying the US, imposing common carriage requirements, and classifying their ISPs as ROAs? Then the story gets much worse. Countries that are signatories to the 2012 ITRs would have ITU mandates on security and spam imposed on their networks, which is to say that the UN would start essentially regulating content on the Internet. This is what Russia, Saudi Arabia, and China have always wanted. Furthermore (and perhaps more frighteningly), classification as ROAs would allow foreign ISPs to forgo commercial peering arrangements in favor of the ITU’s accounting rate system. This is what a number of African governments have always wanted. Ethiopia, for example, considered a bill (I’m not 100 percent sure it ever passed) that would send its own citizens to jail for 15 years for using VOIP, because this decreases Ethiopian international telecom revenues. Having the option of using the ITU accounting rate system would make it easier to extract revenues from international Internet use.

Whatever you think of, e.g., Comcast and Cogent’s peering dispute, applying ITU regulation to ISPs would be significantly worse in terms of keeping the Internet open. By reclassifying US ISPs as common carriers, we would open the door to exactly that. The US government has never objected to ITU regulation of ROAs, so if we ever create a norm under which ISPs are arguably ROAs, we would be essentially undoing all of the progress that we made at the WCIT in standing up for a distinction between old-school telecom and the Internet. I imagine that some net neutrality advocates will find this unfair—after all, their goal is openness, not ITU control over IP service. But this is the reality of international politics: the US would have a very hard time at the ITU arguing that regulating for neutrality and common carriage is OK, but regulating for security, content, and payment is not.

If the goal is to keep the Internet open, we must look somewhere besides Title II.

NETmundial wrap-up
https://techliberation.com/2014/04/25/netmundial-wrap-up/
Fri, 25 Apr 2014

NETmundial is over; here’s how it went down. Previous installments (1, 2, 3).

  • The final output of the meeting is available here. It is being referred to as the Multistakeholder Statement of São Paulo. I think the name is designed to put the document in contention with the Tunis Agenda. Insofar as it displaces the Tunis Agenda, that is fine with me.
  • Most of the civil society participants are not happy. Contrary to my prediction, in a terrible PR move, the US government (among others) weakened the language on surveillance. A statement on net neutrality also did not make it into the final draft. These were the top two issues for most civil society participants.
  • I of course oppose US surveillance, but I am not too upset about the watered down language since I don’t see this as an Internet governance issue. Also, unlike virtually all of the civil society people, I oppose net neutrality laws, so I’m pleased with that aspect of the document.
  • What bothers me most in the final output are two statements that seem to have been snuck in at the last moment by the drafters without approval from others. These are real shenanigans. The first is on multistakeholderism. The Tunis language said that stakeholders should participate according to their “respective roles and responsibilities.” The original draft of the NETmundial document used the same language, but participants agreed to remove it, indicating that all stakeholders should participate equally and that no stakeholders were more special than others. Somehow the final document contained the sentence, “The respective roles and responsibilities of stakeholders should be interpreted in a flexible manner with reference to the issue under discussion.” I have no idea how it got in there. I was in the room when the final draft was approved, and that text was not announced.
  • Similarly, language in the “roadmap” portion of the document now refers to non-state actors in the context of surveillance. “Collection and processing of personal data by state and non-state actors should be conducted in accordance with international human rights law.” The addition of non-state actors was also done without consulting anyone in the final drafting room.
  • Aside from the surveillance issue, the other big mistake by the US government was their demand to weaken the provision on intermediary liability. As I understand it, their argument was that they didn’t want to consider safe harbor for intermediaries without a concomitant recognition of the role of intermediaries in self-policing, as is done through the notice-and-takedown process in the US. I would have preferred a strong, free-standing statement on intermediary liability, but instead, the text was replaced with OECD language that the US had previously agreed to.
  • Overall, the meeting was highly imperfect—it was non-transparent, disorganized, inefficient in its use of time, and so on. I don’t think it was a rousing success, but it was nevertheless successful enough that the organizers were able to claim success, which I think was their original goal. Other than the two last-minute additions that I saw (I wonder if there are others), nothing in the document gives me major heartburn, so maybe that is actually a success. It will be interesting to see if the São Paulo Statement is cited in other fora, and if they decide to repeat this process again next year.
NETmundial day 2 notes
https://techliberation.com/2014/04/24/netmundial-day-2-notes/
Thu, 24 Apr 2014

Today is the second and final day of NETmundial and the third in my series (parts 1 and 2) of quick notes on the meeting.

  • Yesterday, Dilma Rousseff did indeed sign the Marco Civil into law as expected. Her appearance here began with the Brazilian national anthem, which is a very strange way to kick off a multistakeholder meeting.
  • The big bombshell in Rousseff’s speech was her insistence that the multilateral model can peacefully coexist with the multistakeholder model. Brazil had been making a lot of pro-multistakeholder statements, so many of us viewed this as something of a setback.
  • One thing I noticed during the speech was that the Portuguese word for “multistakeholder” actually literally translates as “multisectoral.” This goes a long way toward explaining some of the disconnect between Brazil and the liberals. Multisectoral means that representatives from all “sectors” are welcome, while multistakeholder implies that every stakeholder is welcome to participate, even if they sometimes organize into constituencies. This is a pretty major difference, and NETmundial has been organized on the former model.
  • The meeting yesterday got horribly behind schedule. There were so many welcome speeches, and they went so much over time, that we did not even begin the substantive work of the conference until 5:30pm. I know that sounds like a joke, but it’s not.
  • After three hours of substantive work, during which participants made 2-minute interventions suggesting changes to the text, a drafting group retreated to a separate room to work on the text of the document. The room was open to all participants, but only the drafting group was allowed to work on the drafting; everyone else could only watch (and drink).
  • As of this morning, we still don’t have the text that was negotiated last night. Hopefully it will appear online some time soon.
  • One thing to watch for is the status of the document. Will it be a “declaration” or a “chairman’s report” (or something else)? What I’m hearing is that most of the anti-multistakeholder governments like Russia and China want it to be a chairman’s report because that implies a lesser claim to legitimacy. Brazil, the host of the conference, presumably wants to make a maximal claim to legitimacy. I tend to think that there’s enough wrong with the document that I’d prefer the outcome to be a chairman’s report, but I don’t feel too strongly.
NETmundial is about to begin https://techliberation.com/2014/04/23/netmundial-is-about-to-begin/ https://techliberation.com/2014/04/23/netmundial-is-about-to-begin/#respond Wed, 23 Apr 2014 12:55:09 +0000 http://techliberation.com/?p=74431

As I blogged last week, I am in São Paulo to attend NETmundial, the meeting on the future of Internet governance hosted by the Brazilian government. The opening ceremony is about to begin. A few more observations:

  • The Brazilian Senate passed the landmark Marco Civil bill last night, and Dilma Rousseff, the Brazilian president, may use her appearance here today to sign it into law. The bill subjects data on Brazilians, wherever in the world it is stored, to Brazilian jurisdiction and imposes net neutrality domestically. It also provides a safe harbor for ISPs and creates a notice-and-takedown system for offensive content.
  • Some participants are framing aspects of the meeting, particularly the condemnation of mass surveillance in the draft outcome document, as civil society v. the US government. There is a lot of concern that the US will somehow water down the surveillance language so that it doesn’t apply to the NSA’s surveillance. WikiLeaks has stoked some of this concern with breathless tweets. I don’t see events playing out this way. I am as opposed to mass US surveillance as anyone, but I haven’t seen much resistance from the US government participants in this regard. Most of the comments by the US on the draft have been benign. For example, WikiLeaks claimed that the US “stripped” language referring to the UN Human Rights Council; in fact, the US hasn’t stripped anything because it is not in charge (it can only make suggestions), and eliminating the reference to the HRC is actually a good idea because the HRC is a multilateral, not a multistakeholder, body. I expect a strong anti-surveillance statement to be included in the final outcome document. If it is not, it will probably be other governments, not the US, that block it.
  • In my view, the privacy section of the draft still needs work, however. In particular, it is important to cabin the paragraph so that it addresses governmental surveillance without interfering with voluntary, private arrangements in which users disclose information in exchange for free services.
  • I expect discussions over net neutrality to be somewhat contentious. Civil society participants are generally for it, with some governments, businesses, parts of the technical community, and yours truly opposed.
  • Although surveillance and net neutrality have received a lot of attention, they are not the most important issues at NETmundial. Instead, look for the language that will affect “the future of Internet governance,” which is, after all, what the meeting is about. For example, will the language on stakeholders’ “respective roles and responsibilities” be stricken? This language is held over from the Tunis Agenda, and it carries a lot of meaning. Do stakeholders participate as equals, or do they, especially governments, have separate roles? There is also a paragraph on “enhanced cooperation,” which is a codeword for governments running the show. Look to see whether it is still in the final draft.
  • Speaking of the final draft, here is how it will be produced: During the meeting, participants will have opportunities to make 2-minute interventions on specific topics. The drafting group will make note of the comments and then retreat to a drafting room to make final edits to the draft. This is, of course, not really the open governance process that many of us want for the Internet; it is one in which select, unaccountable participants have the final say. Yet two days is not long enough to hold a truly open, free-wheeling drafting conference. I think the structure of the conference, driven by the perceived need to produce an outcome document with certainty, is unfortunate and somewhat detracts from the legitimacy of whatever is produced, even though I expect the final document to be OK on substance.
Pre-NETmundial Notes https://techliberation.com/2014/04/18/pre-netmundial-notes/ https://techliberation.com/2014/04/18/pre-netmundial-notes/#comments Fri, 18 Apr 2014 14:29:46 +0000 http://techliberation.com/?p=74411

Next week I’ll be in São Paulo for the NETmundial meeting, which will discuss “the future of Internet governance.” I’ll blog more while I’m there, but for now I just wanted to make a few quick notes.

  • This is the first meeting of its kind, so it’s difficult to know what to expect, in part because it’s not clear what others’ expectations are. There is a draft outcome document, but no one knows how significant it will be or what weight it will carry in other fora.
  • The draft outcome document is available here. The web-based tool for commenting on individual paragraphs is quite nice. Anyone in the world can submit comments on a paragraph-by-paragraph basis. I think this is a good way to lower the barriers to participation and get a lot of feedback.
  • I worry that we won’t have enough time to give due consideration to the feedback being gathered. The meeting is only two days long. If you’ve ever participated in a drafting conference, you know that this is not a lot of time. What this means, unfortunately, is that the draft document may be something of a fait accompli. Undoubtedly it will change a little, but the number of changes that can be contemplated will be limited by sheer time constraints.
  • Time will be even more constrained by the absurd amount of time allocated to opening ceremonies and welcome remarks. The opening ceremony begins at 9:30 am and the welcome remarks are not scheduled to conclude until 1 pm on the first day. This is followed by a lunch break, and then a short panel on setting goals for NETmundial, so that the first drafting session doesn’t begin until 2:30 pm. This seems like a mistake.
  • Speaking of the agenda, it was not released until yesterday. While NETmundial has indeed been open to participation by all, it has not been very transparent. An earlier draft outcome document had to be leaked by WikiLeaks on April 8. Not releasing an agenda until a few days before the event is also not very transparent. In addition, the processes by which decisions have been made have not been transparent to outsiders.

See you all next week.

How to Privatize the Internet https://techliberation.com/2014/04/02/how-to-privatize-the-internet/ https://techliberation.com/2014/04/02/how-to-privatize-the-internet/#comments Wed, 02 Apr 2014 15:52:08 +0000 http://techliberation.com/?p=74378

Today on Capitol Hill, the House Energy and Commerce Committee is holding a hearing on the NTIA’s recent announcement that it will relinquish its small but important administrative role in the Internet’s domain name system. The announcement has alarmed some policymakers with a well-placed concern for the future of Internet freedom; hence the hearing. Tomorrow, I will be on a panel at ITIF discussing the IANA oversight transition, which promises to be a great discussion.

My general view is that if well executed, the transition of the DNS from government oversight to purely private control could actually help secure a measure of Internet freedom for another generation—but the transition is not without its potential pitfalls.

The NTIA’s technical administration of the DNS’s “root zone” is an artifact of the Internet’s origins as a U.S. military experiment. In 1989, the government began the process of privatizing the Internet by opening it up to general and commercial use. In 1998, the Commerce Department created ICANN to oversee the DNS on a day-to-day basis. The NTIA’s announcement is arguably the culmination of that decades-long privatization process.

The announcement also undercuts the primary justification used by authoritarian regimes to agitate for control of the Internet. Other governments have long cited the United States’ unilateral control of the root zone, arguing that they, too, should have roles in governing the Internet. By relinquishing its oversight of the DNS, the United States significantly undermines that argument and bolsters the case for private administration of the Internet.

The United States’ stewardship of the root zone is largely apolitical. This apolitical approach to DNS administration is precisely what is at stake during the transition, hence the three pitfalls the Obama administration must avoid to preserve it.

The first pitfall is the most serious but also the least likely to materialize. Despite the NTIA’s excellent track record, authoritarian regimes like Russia, China, and Iran have long lobbied for the ITU, a clumsy and heavily politicized U.N. technical agency, to take over the NTIA’s duties. In its announcement, the NTIA said it would not accept a proposal from an intergovernmental organization, a clear rebuke to the ITU.

Nevertheless, liberal governments would be wise to send the organization a clear message in the form of much-needed reform. The ITU should adopt the transparency we expect of communications standards bodies, and it should focus on its core competency—international coordination of radio spectrum—instead of on Internet governance. If the ITU resists these reforms at its Plenipotentiary Conference this fall, the United States and other countries should slash funding or quit the Union.

ICANN’s Governmental Advisory Committee (GAC) presents a second pitfall. Indeed, the GAC is already the source of much mischief. For example, France and Luxembourg objected to the creation of the .vin top-level domain on the grounds that “vin” (wine) is a regulated term in those countries. Brazil and Peru have held up Amazon.com’s application for .amazon even though they previously agreed to the list of reserved place names, which did not include rivers or states. Last July, the U.S. government, reeling from the Edward Snowden revelations, threw Amazon and the rule of law under the bus at the GAC as a conciliatory measure.

ICANN created the GAC to appease other governments in light of the United States’ outsized role. Since the United States is giving up its special role, the case for the GAC is much diminished. In practice, the limits on the GAC’s power are gradually eroding. ICANN’s board seems increasingly hesitant to overrule it for fear that governments will go back to the ITU and complain that the GAC “isn’t working.” As part of the transition of the root zone to ICANN, therefore, new limits need to be placed on the GAC’s power. Ideally, ICANN would dissolve the GAC altogether.

The third pitfall comes from ICANN itself. The organization is awash in cash from domain registration fees and new top-level domain name applications—which cost $185,000 each—and when the root zone transition is completed, it will face no external accountability. Long-time ICANN insiders speak of “mission creep,” noting that the supposedly purely technical organization increasingly deals with trademark policy and has aided police investigations in the past, a dangerous precedent.

How can we prevent an unaccountable, cash-rich technical organization from imposing its own internal politics on what is supposed to be an apolitical administrative role? In the long run, we may never be able to stop ICANN from becoming a government-like entity, which is why it is important to support research and experimentation in peer-to-peer, decentralized domain name systems. This matter is under discussion, among other places, at the Internet Engineering Task Force, which may ultimately serve as something of a counterweight to an independent ICANN.

Despite these potential pitfalls, it is time for an Internet that is fully in private hands. The Obama administration deserves credit for proposing to complete the privatization of the Internet, but we must also carefully monitor the process to intercept any blunders that might result in politicization of the root zone.

Toward a Post-Government Internet https://techliberation.com/2014/03/17/toward-a-post-government-internet/ https://techliberation.com/2014/03/17/toward-a-post-government-internet/#comments Mon, 17 Mar 2014 13:41:53 +0000 http://techliberation.com/?p=74294

The Internet began as a U.S. military project. For two decades, the government restricted the network to government, academic, and other authorized non-commercial uses. In 1989, the U.S. gave up control—it allowed private, commercial use of the Internet, a decision that allowed the network to flourish and grow as few could imagine at the time.

Late Friday, the NTIA announced its intent to give up the last vestiges of its control over the Internet, the last real evidence that it began as a government experiment. Control of the Domain Name System’s (DNS’s) Root Zone File has remained with the agency despite the creation of ICANN in 1998 to perform the other high-level domain name functions, called the IANA functions.

The NTIA announcement is not a huge surprise. The U.S. government has always said it eventually planned to devolve IANA oversight, albeit with lapsed deadlines and changes of course along the way.

The U.S. giving up control over the Root Zone File is a step toward a world in which governments no longer assert oversight over the technology of communication. Just as freedom of the printing press was important to the founding generation in America, an unfettered Internet is essential to our right to unimpeded communication. I am heartened to see that the U.S. will not consider any proposal that involves IANA oversight by an intergovernmental body.

Relatedly, next month’s global multistakeholder meeting in Brazil will consider principles and roadmaps for the future of Internet governance. I have made two contributions to the meeting, a set of proposed high-level principles that would limit the involvement of governments in Internet governance to facilitating participation by their nationals, and a proposal to support experimentation in peer-to-peer domain name systems. I view these proposals as related: the first keeps governments away from Internet governance and the second provides a check against ICANN simply becoming another government in control of the Internet.

Get Real(ist): Don’t confuse NSA regulation with Internet regulation https://techliberation.com/2013/10/27/get-realist-dont-confuse-nsa-regulation-with-internet-regulation/ https://techliberation.com/2013/10/27/get-realist-dont-confuse-nsa-regulation-with-internet-regulation/#comments Sun, 27 Oct 2013 15:26:09 +0000 http://techliberation.com/?p=73733

In her UN General Assembly speech denouncing NSA surveillance, Brazil’s President Dilma Rousseff said:

Information and communications technologies cannot be the new battlefield between States. Time is ripe to create the conditions to prevent cyberspace from being used as a weapon of war, through espionage, sabotage, and attacks against systems and infrastructure of other countries. … For this reason, Brazil will present proposals for the establishment of a civilian multilateral framework for the governance and use of the Internet and to ensure the protection of data that travels through the web.

We share her outrage at mass surveillance. We share her opposition to the militarization of the Internet. We share her concern for privacy.

But when President Rousseff proposes to solve these problems by means of a “multilateral framework for the governance and use of the Internet,” she reveals a fundamental flaw in her thinking. It is a flaw shared by many in civil society.

You cannot control militaries, espionage and arms races by “governing the Internet.” Cyberspace is one of many aspects of military competition. Unless one eliminates or dramatically diminishes political and military competition among sovereign states, states will continue to spy, break into things, and engage in conflict when it suits their interests. Cyber conflict is no exception.

Rousseff is mixing apples and oranges. If you want to control militaries and espionage, then regulate arms, militaries and espionage – not “the Internet.”

This confusion is potentially dangerous. If the NSA outrages feed into a call for global Internet governance, and this governance focuses on critical Internet resources and the production and use of Internet-enabled services by civil society and the private sector, as it inevitably will, we are certain to get lots of governance of the Internet, and very little governance of espionage, militaries, and cyber arms.

In other words, Dilma’s “civilian multilateral framework for the governance and use of the Internet” is only going to regulate us – the civilian users and private sector producers of Internet products and services. It will not control the NSA, the Chinese People’s Liberation Army, the Russian FSB, or the British GCHQ.

Realism in international relations theory is based on the view that the international system is anarchic. This does not mean that it is chaotic, but simply that the system is composed of independent states and there is no central authority capable of coercing all of them into following rules. The other key tenet of realism is that the primary goal of states in the international system is their own survival.

It follows that the only way one state can compel another state to do anything is through some form of coercion, such as war, a credible threat of war, or economic sanctions. And the only time states agree to cooperate to set and enforce rules is when it is in their self-interest to do so. Thus, when sovereign states come together to regulate things internationally, their priorities will always be to:

  • Preserve or enlarge their own power relative to other states; and
  • Ensure that the regulations are designed to bring under control those aspects of civil society and business that might undermine or threaten their power.

Any other benefits, such as privacy for users or freedom of expression, will be secondary concerns. That’s just the way it is in international relations. Asking states to prevent cyberspace from being used as a weapon of war is like asking foxes to guard henhouses.

That’s one reason why it is so essential that these conferences be fully open to non-state actors, and that they not be organized around national representation.

Let’s think twice about linking the NSA reaction too strongly to Internet governance. There is some linkage, of course. The NSA revelations should remind us to be realist in our approach to Internet governance. This means recognizing that all states will approach Internet regulation with their own survival and power uppermost on their agendas; it also means that no single state can be trusted as a neutral steward of the global Internet, because each will inevitably use its position to benefit itself. These implications of the Snowden revelations need to be recognized. But let us not confuse NSA regulation with Internet regulation.

IGF Day 3: Unanswered questions https://techliberation.com/2013/10/24/igf-day-3-unanswered-questions/ https://techliberation.com/2013/10/24/igf-day-3-unanswered-questions/#respond Thu, 24 Oct 2013 23:29:23 +0000 http://techliberation.com/?p=73731

The forum has largely been overtaken by discussion of ICANN’s move to organize a new Internet governance coalition. ICANN representatives have had both open- and closed-door meetings to push the proposal, but there are still many questions that have not been adequately answered.

One important question is about the private discussions that have led to this. The I-stars came out at least nominally aligned on this issue, though there is speculation that they are not all totally unified. Over drinks, I mentioned to an ICANN board member that it rubs a lot of people in civil society the wrong way that the I-stars seem to have coordinated on this in private. He replied that I was probably assuming too much about the level of coordination. If that’s the case, then I wonder if we will hear more from the other I-stars about their level of support for ICANN’s machinations.

More basically, we still don’t know much about the Rio non-summit. It will be in Rio, it will be in May, there will be some sort of output document. But we don’t know the agenda, or the agenda-setting process, or even the process for setting an agenda-setting process.

And strategically, we don’t know how the Brazil meeting is going to affect all of the other parts of the take-over-the-Internet industry in the coming year. The CWG-Internet happens next month, and they will take up Brazil’s proposal from the WTPF. But since Brazil is positioning itself as a leader in this new process (and aligned with ICANN now), what will they try to get at the CWG? WTDC is in March-April. And of course the Plenipot will be in the fall next year. If the Brazil summit is perceived to have failed in any sense, will that make the battle at Plenipot even more intense?

Also, whose idea was it to have a gala without alcohol?

IGF Day 2: The Coalition https://techliberation.com/2013/10/23/igf-day-2-the-coalition/ https://techliberation.com/2013/10/23/igf-day-2-the-coalition/#respond Thu, 24 Oct 2013 00:09:42 +0000 http://techliberation.com/?p=73711

As expected, today at 1pm there was a packed, off-the-books meeting facilitated by the “I-star” organizations (ICANN, ISOC, IETF, and a bunch of groups that don’t begin with I). The purpose of the meeting was to build support for a new Internet governance “coalition.” The argument is that because of the NSA’s global surveillance programs, the US is losing support for its perceived leadership on Internet governance. In order to avoid greater governmental or intergovernmental intrusion into the Internet, the technical community, as signaled in the Montevideo statement, must go on the offensive and create an alternative to such intrusion.

This argument is controversial, to say the least. To what extent does the “offensive” entail creating a top-down institution to deal with Internet policy issues? Neither the technical community nor civil society wants government to be in charge of the Internet, but the technical community (especially ICANN) seems much more comfortable with top-down non-governmental control. I worry that ICANN is going to become increasingly government-like. In any case, we are witnessing a small but historic rift between civil society and the technical community, which have always been on the same side in the war to keep governments off the Internet.

Even if ICANN’s argument makes a kind of sense, it may be reckless to pursue it in the proposed way. It’s now looking like there will be a don’t-call-it-a-summit in Rio in early May, hosted by the Brazilian government, to discuss these issues. Even if ICANN has good reason to believe that Brazil is negotiating in good faith, there is always the possibility that Brazil gets what it wants in the end. They are not likely to just roll over.

I’m open to the idea that we need an affirmative answer to the question of Internet policy institutions. But I’d feel a lot more comfortable if such institutions evolved bottom-up rather than emerging from a grand push, organized secretly by some members of the technical community, to create an alternative. Hopefully with the creation of the new coalition mailing list, everything can be done out in the open from here on out.

Day 1 of IGF: “What do you think about the Brazil meeting?” https://techliberation.com/2013/10/22/day-1-of-igf-what-do-you-think-about-the-brazil-meeting/ https://techliberation.com/2013/10/22/day-1-of-igf-what-do-you-think-about-the-brazil-meeting/#respond Tue, 22 Oct 2013 14:15:53 +0000 http://techliberation.com/?p=73705

Day 1 of the Internet Governance Forum is in the books, and everyone is talking about what will happen on Day 2. Brazil recently announced that it will host a meeting on Internet governance in April. Tomorrow, ICANN is hosting a meeting at 1pm to explain how the April meeting will work.

Everyone I’ve talked to in the hallways has brought up the April meeting. No one is quite sure what to expect.

On one hand, Brazil has been part of the coalition that is pushing to do more Internet governance at the ITU. On the other hand, ICANN seems to be a willing participant in Brazil’s scheme. The recent “Montevideo Statement,” issued by various Internet organizations, called for globalizing the IANA function, which means at a minimum removing the US’s special role of maintaining the domain name system’s root zone file.

ICANN wants independence from the US government, and Brazil wants ICANN to be independent from the US government (and possibly dependent on the ITU), so this makes them allies for now.

Bizarrely, NSA surveillance continues to be cited as a reason for Brazil’s actions, although of course the IANA function has nothing to do with surveillance. The IANA issue is mostly about status. Other governments seem to feel slighted by the US’s control of the root zone file.

In any case, tomorrow we may know slightly more about ICANN and Brazil’s schemes.

Pakistan’s Statement on NSA Surveillance, Human Rights, and Internet Governance https://techliberation.com/2013/09/18/pakistans-statement-on-nsa-surveillance-human-rights-and-internet-governance/ https://techliberation.com/2013/09/18/pakistans-statement-on-nsa-surveillance-human-rights-and-internet-governance/#respond Wed, 18 Sep 2013 18:53:44 +0000 http://techliberation.com/?p=73552

Last month, I wrote at The Guardian that NSA surveillance is harming our Internet freedom efforts. Now we have tangible evidence of that. Speaking at the UN Human Rights Council on behalf of Cuba, Venezuela, Zimbabwe, Uganda, Ecuador, Russia, Indonesia, Bolivia, Iran, and China, Pakistan delivered the following statement (video, starts around 52:25). Pay special attention to the last two paragraphs:

Mr. President,

Freedom of expression is a fundamental human right. The right to privacy is an essential element of the right to freedom of expression as defined in the ICCPR. At the last session of the Human Rights Council, the Special Rapporteur on Freedom of Expression presented his report which focused on the right to privacy and freedom of expression and opinion in the context of states surveillance of communications systems.

We believe that this is an area of great concern, particularly in light of recent revelations regarding the use and abuse of advanced surveillance technologies by some states. These involve unilateral unauthorized access to private data and performing extensive, in-depth surveillance on live communications and stored information with examples including email, voice chat, videos, photos, file transfers, and social networking details. The extent of recent events of mass data collection has been far greater than the global community knew and is a serious violation of the right to privacy as well as domestic and international law.

Much of the world’s electronic communications pass through only one country because electronic communications data tend to follow the least expensive route rather than the most physically direct route, and the bulk of the world’s Internet infrastructure is also based there. This provides an opportunity for intercepting the private communications of foreign nationals as their electronic data pass into or through that one country.

This situation is further aggravated when we see several major international internet and telecommunication technology companies overstepping in privacy and information theft including companies like Microsoft, Yahoo, Google, Facebook, YouTube, AOL, Skype and Apple. Some of these entities have been developing and executing their own surveillance capabilities and intruding into the private space of their customers around the globe without their knowledge or consent. As the Special Rapporteur has mentioned in his report, the private sector corporations often facilitate some states in their surveillance of individuals and states are increasingly adopting legislation requiring communications service providers direct access to the communications data. This is a disturbing development because this is intrusion of privacy on a mega-scale. It means that states can use these technologies and data against persons who are not their citizens and do not reside in their borders. This has complicated legal and human rights implications.

Article 12 of the Universal Declaration of Human Rights, and numerous international statutes and treaties forbid such systems of massive, pervasive surveillance. More importantly, the international community needs to take urgent action to protect individuals from such violation of their fundamental freedom.

A transparent international system with adequate international framework of internet governance including appropriate safeguards is all the more important in such circumstances. The internet is too big, too international and too much of a household thing to be left operating by a few who have misused it without any international legislation and monitoring of these abuses.

The existing mechanisms like the Internet Governance Forum established under the paragraph 72 of the World Summit on Information Society (WSIS)-Tunis Agenda have not been able to deliver the desired results. A strategic rethinking of the global internet governance mechanism is inevitable. Further development of an international mechanism in the context of ‘Enhanced cooperation’ within the WSIS Tunis Agenda can be a concrete way forward. However we will need to be sincere in our efforts to ensure a transparent, free, fair and respectful international intergovernmental mechanism of internet governance and one that also ensures the right to privacy.

Net Neutrality Returns – As Farce https://techliberation.com/2013/09/11/net-neutrality-returns-as-farce/ https://techliberation.com/2013/09/11/net-neutrality-returns-as-farce/#respond Wed, 11 Sep 2013 17:20:16 +0000 http://techliberation.com/?p=73530

Over on Forbes today, I have a very long post inspired by Monday’s oral arguments in Verizon’s challenge of the FCC’s Open Internet rules, passed in 2010.

I say “inspired” because the post has nothing to say about the oral arguments, which, in any case, I did not attend. Mainstream journalists can’t resist the temptation to try to read into the questions asked or the mood of the judges some indication of how the decision will come out.

But as anyone who has ever worked in a court or followed appellate practice well knows, the tone of oral arguments signals nothing about a judge’s point of view. Often, the harshest questioning is reserved for the side a judge is leaning toward supporting, perhaps because the briefs filed were inadequate. Bad briefs create more work for the judge and her clerks.

I use the occasion of the hearing to take a fresh look at the net neutrality “debate,” which has been ongoing since at least 2005, when I first started paying attention to it. In particular, I try to disentangle the political term “net neutrality” (undefined and, indeed, not even used in the 2010 Open Internet order) from the engineering principles of packet routing.

According to advocates for government regulation of broadband access, the political argument is that net neutrality regulation is simply a codification of the Internet’s design. But regardless of whether it would even make sense to transform the FCC into the governing body of engineering protocols for the network (the Internet Society and its engineering task forces are and always have been doing a fine job, thanks very much), the reality is that the political argument has almost nothing to do with the underlying engineering.

Indeed, those most strongly advocating for more government regulation either don’t understand the engineering or intentionally mischaracterize it, or both.  That’s clear from the wide range of supposed competitive problems that have been lumped together under the banner of “net neutrality” issues over the years–almost none of which have anything to do with packet routing.

Fortunately, very little of the larger political agenda of the loose coalition of net neutrality advocates is reflected in the rules ultimately passed by a bare majority of the FCC in 2010.  Even so, those rules, limited as they were, face many challenges.

For one thing, the FCC, despite over a year of dedicated attention to the problem, could identify only four incidents that suggested any kind of market failure, and only one of which (the Comcast-BitTorrent incident) was ever actually considered in detail by the Commission.  (Two of the others never even rose to the level of a complaint.)  The agency was left to regulate on the basis of “preserving” the Open Internet through what it called (nearly a dozen times) “prophylactic” rules.

Second, and of particular interest in the D.C. Circuit proceeding, Congress has never authorized the FCC to issue rules dealing with broadband Internet access. Though many authorizing bills have circulated over the years, none has ever made it out of committee. With no legal basis to regulate, the agency was left pointing to irrelevant provisions of the existing Communications Act–most of which were already rejected by the same court in the Comcast case. Nothing in the law has changed since Comcast, and on that basis, regardless of the merits of Internet regulation, the FCC is very likely to lose, which the Commission surely knew when it passed the rules in 2010.

The piece ends by describing, as I did in my testimony before the House Judiciary Committee in early 2011, how the Report and Order betrays the technical reality that, from an engineering standpoint, even the supposed neutrality of packet routing is largely a sentimental myth. The FCC identified and exempted a dozen network management technologies, practices, and protocols that it acknowledged do not follow the neutrality principle, but which are essential to effective and efficient management of the network. There is no “neutral” Internet to preserve, and never was.

The agency was right to exempt these practices.  But the problem with the rules as written is that they could not and did not extend to future innovations that new applications and new users will certainly make as essential as today’s management techniques.

If the rules stand, network engineers, application developers, device makers and others in the vibrant, dynamic Internet ecosystem will be forced to seek permission to innovate from the FCC, which will both slow the high-speed world of Internet design to a crawl and introduce a decision maker with no technical expertise and lots of political baggage.

That of course was the kind of counter-productive and unnecessary regulatory intrusion that Internet users successfully rose up against last year when the UN’s International Telecommunication Union threatened to assert itself in basic Internet governance, or the year before that when Congress, without technical understanding of the most basic variety, tried to re-architect the Internet on behalf of media companies in the failed SOPA and PIPA legislation.

If the FCC gains a foothold in broadband access with the Open Internet rules or other efforts to gain oversight where Congress has delegated none, expect a similar reaction.  Or, in any case, hope for one.

Book Review: Anupam Chander’s “Electronic Silk Road”
https://techliberation.com/2013/08/24/book-review-anupam-chanders-electronic-silk-road/
Sat, 24 Aug 2013 21:53:09 +0000

As I’ve noted before, I didn’t start my professional life in the early 1990s as a tech policy wonk. My real passion 20 years ago was free trade policy. Unfortunately for me, as my boss rudely informed me at the time, the world was already brimming with aspiring trade analysts and probably didn’t need another. This was the time of NAFTA and the WTO negotiations, and seemingly everybody was lining up to get into the world of trade policy.

And so, while I was finishing a master’s degree with trade theory applications and patiently hoping for opportunities to open up, I decided to take what I thought was going to be a brief detour into the strange new world of the Internet and information technology policy. Of course, I never looked back. I was hooked on Net policy from Day 1.  But I never stopped caring about trade theory, and I have always remained passionate about the essential role that free trade plays in expanding commerce, improving human welfare, and facilitating more peaceful interactions among the diverse cultures and countries of this planet.

I only tell you this part of my own backstory so that you understand why I was so excited to receive a copy of Anupam Chander’s new book, The Electronic Silk Road: How the Web Binds the World Together in Commerce. Chander’s book weaves together trade theory and modern information technology policy issues. His over-arching goal is to sketch out and defend “a middle ground between isolation and unregulated trade, embracing free trade and also its regulation.” (p. 209)

In a writing style that is clear and direct, Chander explores the competing forces that facilitate and threaten what he refers to as “Trade 2.0.”  He identifies four distinctive legal challenges for “net-work,” which is his generic descriptor for “information services delivered remotely through electronic communications systems.” (p. 2):

  1. “Legal roadblocks to the free flow of net-work;
  2. The lack of adequate legal infrastructure, as compared to trade in traditional goods;
  3. The threat to law itself posed by the footloose nature of net-work and the uncertainty of whose law should govern net-work transactions; and
  4. The danger that local control of net-work might lead to either Balkanization – the disintegration of the World Wide Web into local arenas – or Stalinization – the repression of political dissidents, identified through their online activity by compliant net-work service providers.” (p. 143).

At the heart of the book is an old tension that has long haunted trade policy: How do you achieve the benefits of free trade through greater liberalization without completely undermining the sovereign authority of nation-states to continue enforcing their preferred socio-political legal and cultural norms? After all, as Chander notes, “States will be loathe to abandon their law in the face of the offerings mediated by the Internet.” (p. 34)  “If crossborder flows of information grossly undermine our privacy, security, or the standards of locally delivered services, they will not long be tolerated,” he notes. (p. 173)  These are just a few of the reasons that barriers to trade remain and why, as Chander explains, “the flat world of global business and the self-regulating world of cyberspace remain distant ideals.” (p. 173).

Striking the Balance

Chander wants to counter that impulse to expand the horizons of Trade 2.0, but he argues that, to some extent, nation-states will always need to be appeased along the way. Consequently, he argues that “we must dismantle the logistical and regulatory barriers to net-work trade while at the same time ensuring that public policy objectives cannot easily be evaded through simple jurisdictional sleight of hand or keystroke.” (p. 34) Again, this reflects his desire for both greater liberalization of markets as well as the preservation of a residual role for states in shaping online commerce and activities.

He says we can achieve this Goldilocks-like balance through the application of three key principles.

The first is harmonization of laws and policies, preferably through multinational accords. “Efforts to harmonize laws across nations and standards among professional associations will prove essential to preserve a global cyberspace in the face of national regulation,” Chander insists. (p. 187)

The second principle is “glocalization,” or “the creation or distribution of products or services intended for a global market but customized to conform to local laws — within the bounds of international law.” (p. 169)

The final key principle is more self-regulatory in character. It is the operational norm of “do no evil” as it pertains to requests from repressive states to have Internet intermediaries crack down on free speech or privacy.  “[W]e must seek to nurture a corporate consciousness among information providers of their role in liberation or oppression,” Chander argues. (p. 205)

In a sense, what Chander is recommending here is largely the way global information markets already work. Thus, his book is less aspirational than descriptive of the reality we see on the ground today.

For example, the harmonization efforts he recommends to facilitate Trade 2.0 have been underway in various fora and trade accords for several years now. Chander does a nice job describing many of those efforts in the book.

Likewise, his “glocalization” recommendation is to some extent already today’s norm. After a series of high-profile legal skirmishes over the past dozen years, Internet giants such as Yahoo, Google, Facebook, Cisco, Microsoft and others have all eventually folded under legal and regulatory pressure from various governments across the globe and sought to accommodate parochial regulatory requests, even as they expand their efforts internationally. Again, Chander discusses several of the more well-known case studies in the text.

Finally, however, there have been moments when — especially as it pertains to certain free speech matters — some of these corporate players have stood up for a “do no evil” approach when repressive governments come calling.  In this regard, Chander only briefly mentions the work of the Global Network Initiative, which is somewhat surprising since it has been focused on this mission since its inception in 2008. Nonetheless, such “do no evil” moments have happened (for example, Google bowing out of China), although the track record of success here has been spotty to say the least.

Technological Neutrality

Chander also wants to make sure that online markets are not somehow advantaged relative to traditional markets and technologies. “Trade law should not allow countries to insist on a regulatory nirvana in cyberspace unmatched in real space,” he insists. (p. 155)

Fair enough, but how we achieve neutrality and level the proverbial playing field is, of course, important. The problem is that most nation-states seek to harmonize in the direction of greater control. The rise of electronic networks and online commerce presents us with the opportunity to reconsider the wisdom of long-standing statutes and regulations that are either no longer needed or perhaps never should have been on the books in the first place.

This is why I have repeatedly proposed here and elsewhere that, when it comes to domestic information policy spats that involve old and new players and technologies, we should consider borrowing a page from trade law by adopting the equivalent of a “Most Favored Nation” (MFN) clause for communications and media policy. In a nutshell, this policy would state that: “Any operator seeking to offer a new service or entering a new line of business, should be regulated no more stringently than its least regulated competitor.” Such a MFN for communications and media policy would ensure that regulatory parity exists within this arena as the lines between existing technologies and industry sectors continue to blur.

Although it will often be difficult to achieve in practice, the aspirational goal of placing all players and technologies on the same liberalized level playing field should be at the heart of information technology policy to ensure non-discriminatory regulatory treatment of competing providers and technologies.

But let’s be clear about what this means: To level the proverbial playing field properly, I believe we should be “deregulating down” instead of regulating up to place everyone on equal footing. This would achieve technological neutrality through greater technological freedom and marketplace liberalization.

Of course, others (possibly including Chander) would likely claim that this could lead to a “race to the bottom” in certain instances by disallowing state action and the application of local laws and norms. But one person’s “race to the bottom” is another person’s race to the top!  It all depends on the perspective you adopt toward liberalization efforts. For me, the more liberalization the better. In one market after another, deregulation has been shown to improve consumer welfare by expanding choice, increasing innovation, and generally pushing prices lower.

Policies of Freedom

What other specific policies can help us strike the right balance going forward?

I was extremely pleased to see Chander discuss the Clinton Administration’s July 1997 Framework for Global Electronic Commerce. It was instrumental in setting the right tone for e-commerce policy before the turn of the century. The Framework stressed the importance of taking a general “hands off” approach to these markets and treating the Internet as a global free-trade zone. It set forth five key principles for Net governance, including: “the private sector should lead;” “governments should avoid undue restrictions on electronic commerce;” “where governmental involvement is needed, its aim should be to support and enforce a predictable, minimalist, consistent and simple legal environment for commerce,” and other light-touch policy recommendations.

As I noted in the title of my 2012 Forbes essay on the Framework, “15 Years On, President Clinton’s 5 Principles for Internet Policy Remain the Perfect Paradigm.” Chander generally embraces these principles, too, even though some of his “glocalization” recommendations cut against the grain of this vision.

Importantly, Chander also highlights four specific U.S. policies that have fostered the growth of electronic trade.

  1. “The First Amendment guarantee of freedom of speech;
  2. The Communications Decency Act’s Section 230, granting immunity to web hosts for user-generated information; [see my old Forbes essay, “The Greatest of All Internet Laws Turns 15” for an explanation of why Sec. 230 has been so important.]
  3. Title II of the Digital Millennium Copyright Act (DMCA), granting immunity to web hosts for copyright infringement; and
  4. Weak consumer privacy regulations [which have] created breathing room for the rise of Web 2.0.”

“This permissive legal framework offers the United States as a sort of export-processing zone in which Internet entrepreneurs can experiment and establish services.” (p. 57)  Chander gets it exactly right here. Legally speaking, this is the secret sauce that continues to power the Net.

But Chander doesn’t fully confront the inherent contradiction in earlier calling for “technological neutrality” between cyberspace and the traditional economy while also praising all these legal policies, which generally treated the Internet in an “exceptionalist” fashion. I would argue that some of that asymmetry was essential, however, not only to allow the Net to get out of its cradle and grow, but also because it taught us that light-touch regulation was generally superior to traditional heavy-handed regulatory paradigms and mechanisms. Now we just need to keep harmonizing in the direction of the greater freedom that the Internet and online markets enjoy.

Multi-stakeholderism?

One surprising thing about Chander’s book is the general absence of the term “multi-stakeholderism.”  It is getting hard to pick up any Internet policy tract these days and not find reference to multi-stakeholder processes of one sort or another. In particular, I expected to see more linkages to broader Net freedom fights involving the U.N. and the WCIT process.

In this sense, it would have been interesting to see Chander bridge the gap between his work here on free trade in information services and the proposals of various Internet governance scholars and advocacy groups. In particular, I would have liked to have heard what Chander thinks about the conflicting Internet policy paradigms set forth in important recent books from Rebecca MacKinnon (“Consent of the Networked”) and Ian Brown and Christopher Marsden (“Regulating Code”) on one hand, versus those of Milton Mueller (“Networks and States”) and David Post (“Jefferson’s Moose”) on the other. I think Chander would generally be more comfortable with the policy paradigms and proposals sketched out by MacKinnon and Brown & Marsden (whereas I am definitely more in league with Mueller and Post), but I’m not entirely sure where he stands.

Regardless, I would have liked to have seen some discussion of these issues in Chander’s otherwise excellent book.

Semantic Choices

I suppose my only other complaint with the book comes down to some semantic issues, beginning with its title.  In some ways, calling it The Electronic Silk Road makes perfect sense since Chander wants us to think of the parallels to the Silk Road of ancient times, of course. Alas, these days it is hard to utter the term “Silk Road” and not think of people buying and selling illegal drugs or other shady stuff in the online black market of the same name. So that will be confusing to some.

I’m also not a big fan of some of the other catchphrases Chander uses throughout the book. The term “net-work,” for example, is a bit too cute for my taste, and there are times it gets confusing. And “glocalization” is the sort of thing that you’d expect to see on the Fake Jeff Jarvis parody account on Twitter (actually, I think he has used it before), and once critic Evgeny Morozov catches wind of it he will, no doubt, eventually use it to linguistically lynch Chander.

Finally, should trade in information and e-commerce be “Trade 2.0” or is it really “Trade 3.0”? To me, Trade 1.0 = agricultural & industrial trade; Trade 2.0 = trade in services; and Trade 3.0 = trade in information and electronic commerce. Doesn’t that make more sense? In any event, the whole 1.0, 2.0, 3.0 thing has gotten a bit clichéd in its own right.

Conclusion

I enjoyed Anupam Chander’s Electronic Silk Road and can recommend it to anyone who is looking to connect the dots between international trade theory and Internet policy and e-commerce developments. The reader will find a little bit of everything in the book, such as classical trade theory from Smith and Ricardo alongside a discussion of Coasean theories of the firm and Benkler-esque theories of commons-based peer production.

Best of all, it is an extremely accessible text: either a trade policy guru or a Net policy wonk could pick it up and learn a lot about issues from the other field that they may not have heard of before. I could also imagine several of the chapters becoming assigned reading in trade policy courses and cyberlaw programs alike. It’s a supremely balanced treatment of the issues.

Could governments make themselves regulators of content on the new TLDs?
https://techliberation.com/2013/08/19/could-governments-make-themselves-regulators-of-content-on-the-new-tlds/
Mon, 19 Aug 2013 20:50:42 +0000

On Sunday, the New York Times ran a story by Natasha Singer on the ongoing generic top-level domain (gTLD) expansion. Singer correctly notes that there is a great deal of skepticism that the new gTLDs will add social value. After all, what is the social value of .book when there is already book.com?

Singer also raises cultural, expression, and competition concerns:

There’s a larger issue at stake, however. Advocates of Internet freedom contend that such an expanded address system effectively places online control over powerful commercial and cultural interests in the hands of individual companies, challenging the very idea of an open Internet. Existing generic domains, like .net and .com, overseen by Verisign Inc., a domain registry, have an open-use policy; that means consumers can buy domain names ending in .com directly from retail registrars like GoDaddy. With a new crop of applicants, however, Icann initially accepted proposals for closed or restricted generic domains, a practice that could limit competing views and businesses.

It’s true that there is concern over “closed generics,” but I think there is a deeper problem than anti-competitiveness that could emerge from TLD expansion.

Suppose that, as anticipated, TLD registries are able to restrict the scope of the sites that can use their domain name. For example, Google intends to restrict .app to uses related to (Android?) applications. These restrictions could make a great deal of economic sense—owning a .app domain name could function as a certification of a certain level of quality.

Putting aside any anti-competitive concern, restricted TLDs raise the question of who, exactly, is the final arbiter. Let’s suppose that Google rejects an application for a .app domain name for whatever reason. Can the rejected applicant appeal? And to whom?

Google is a Delaware corporation based in California. ICANN is incorporated in California. I can imagine lawsuits in Delaware or California over domain name rejections.

But the scarier possibility is that ICANN will try to resolve these disputes internally, possibly with input from its Governmental Advisory Committee (GAC). This would be problematic because the GAC is not known for its adherence to any sort of rule of law.

If GAC intervention in .app doesn’t worry you, consider the .gay TLD. At least one vision of .gay is as a safe online space for the global gay community. Suppose that the .gay registry, after winning its bid and publicly setting out content guidelines, rejects sites that engage in hate speech against gays. If disputes over such rejections end up in the GAC, then that could be disastrous, as countries like Saudi Arabia and Iran have objected to the mere existence of the .gay TLD.

We can debate whether restricted TLDs should be allowed in the first place, but we should all agree that if they are, the GAC should have no role in policing the content restrictions that registries impose to maximize the value of the namespace. The last thing we need is the world’s governments making policy about expression online.

The GAC officially objects to .amazon
https://techliberation.com/2013/07/16/the-gac-officially-objects-to-amazon/
Tue, 16 Jul 2013 14:06:33 +0000

ICANN is meeting in Durban, South Africa this week, and this morning, its Governmental Advisory Committee, which goes by the delightfully onomatopoetic acronym GAC, announced its official objection to the .amazon top-level domain name, which was set to go to Amazon, the online purveyor of books and everything else. Domain Incite reports:

The objection came at the behest of Brazil and other Latin American countries that claim rights to Amazon as a geographic term, and follows failed attempts by Amazon to reach agreement.

Brazil was able to achieve consensus in the GAC because the United States, which refused to agree to the objection three months ago in Beijing, had decided to keep mum this time around.

The objection will be forwarded to the ICANN board in the GAC’s Durban communique later in the week, after which the board will have a presumption that the .amazon application should be rejected.

The board could overrule the GAC, but it seems unlikely.

This is a loss for anything resembling the rule of law on the Internet. There are rules for applying for new generic TLDs, and the rules specifically say which geographic terms are protected. Basically, anything on the list known as ISO 3166-1 is verboten. But “Amazon” is not on that list, nor is “Patagonia”; the .patagonia application was recently withdrawn. Amazon and Patagonia followed the rules and won their respective gTLDs fair and square.
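The geographic-name screen described above amounts to a simple membership test, which can be sketched in a few lines of Python. This is a hand-rolled illustration only: the country names below are a small hypothetical sample standing in for the full ISO 3166-1 list, and ICANN’s actual Applicant Guidebook rules cover additional name forms beyond exact country names.

```python
# Illustrative sketch of the gTLD geographic-name screen.
# NOTE: this hand-picked sample stands in for the full ISO 3166-1 list;
# the real ICANN rules also cover other protected name variants.
PROTECTED_GEO_NAMES = {
    "brazil", "peru", "argentina", "south africa", "saudi arabia",
}

def is_protected_geographic_term(applied_string: str) -> bool:
    """True if an applied-for gTLD string matches a protected country name."""
    return applied_string.strip().lstrip(".").lower() in PROTECTED_GEO_NAMES

# Under the rule as written, .amazon and .patagonia pass the screen:
for s in (".amazon", ".patagonia", ".brazil"):
    status = "protected" if is_protected_geographic_term(s) else "allowed"
    print(f"{s}: {status}")
```

The point of a bright-line list like this is precisely that names absent from it pass; the GAC objection to .amazon effectively overrides that test after the fact.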

The US’s decision to appease other countries by remaining silent is a mistake. The idea of diplomacy is to get countries to like you so that you can get what you want on policy, not to give up what is right on policy so that other countries will like you. I agree with Milton Mueller, whose bottom line is:

What is at stake here is far more important than the interests of Amazon, Inc. and Patagonia, Inc. What’s really at stake is whether the Internet is free of pointless constraints and petty political objections; whether governments can abuse the ICANN process to create rights and powers for themselves without any international legislative process subject to democratic and judicial checks and balances; whether the alternative governance model that ICANN was supposed to represent is real; whether domain name policy is made through an open, bottom-up consensus or top-down by states; whether the use of words or names on the Internet is subject to arbitrary objections from politicians globalizing their local prejudices.
Book Review: Ronald Deibert’s “Black Code: Inside the Battle for Cyberspace”
https://techliberation.com/2013/07/16/book-review-ronald-deiberts-black-code-inside-the-battle-for-cyberspace/
Tue, 16 Jul 2013 13:01:57 +0000

Ronald J. Deibert is the director of The Citizen Lab at the University of Toronto’s Munk School of Global Affairs and the author of an important new book, Black Code: Inside the Battle for Cyberspace, an in-depth look at the growing insecurity of the Internet. Specifically, Deibert’s book is a meticulous examination of the “malicious threats that are growing from the inside out” and which “threaten to destroy the fragile ecosystem we have come to take for granted.” (p. 14) It is also a remarkably timely book in light of the recent revelations about NSA surveillance and how it is being facilitated with the assistance of various tech and telecom giants.

The clear and colloquial tone that Deibert employs in the text helps make arcane Internet security issues interesting and accessible. Indeed, some chapters of the book almost feel like they were pulled from the pages of a techno-thriller, complete with villainous characters, unexpected plot twists, and shocking conclusions. “Cyber crime has become one of the world’s largest growth businesses,” Deibert notes (p. 144), and his chapters focus on many prominent recent examples, including cyber-crime syndicates like Koobface, government cyber-spying schemes like GhostNet, state-sanctioned sabotage like Stuxnet, and the vexing issue of zero-day exploit sales.

Deibert is uniquely qualified to narrate this tale not just because he is a gifted story-teller but also because he has had a front row seat in the unfolding play that we might refer to as “How Cyberspace Grew Less Secure.” Indeed, he and his colleagues at The Citizen Lab have occasionally been major players in this drama as they have researched and uncovered various online vulnerabilities affecting millions of people across the globe. (I have previously reviewed and showered praise on a couple of important books that Deibert co-edited with scholars from The Citizen Lab and Harvard’s Berkman Center, including: Access Controlled: The Shaping of Power, Rights, and Rule in Cyberspace and Access Denied: The Practice and Policy of Global Internet Filtering. They are truly outstanding resources worthy of your attention.)

Black Code’s Many Meanings

So, what is “black code” and why should we be worried about it? Deibert uses the term as a metaphor for many closely related concerns. Most generally it includes “that which is hidden, obscured from the view of the average Internet user.” (p. 6) More concretely, it refers to “the criminal forces that are increasingly insinuating themselves into cyberspace, gradually subverting it from the inside out.” (p. 7) “Those who take advantage of the Internet’s vulnerabilities today are not just juvenile pranksters or frat house brats,” Deibert notes, “they are organized criminal groups, armed militants, and nation states.” (p. 7-8) Which leads to the final way Deibert uses the term “black code.” It also, he says, “refers to the growing influence of national security agencies, and the expanding network of contractors and companies with whom they work.” (p. 8)

Deibert is worried about the way these forces and factors are working together to undermine online stability and security, and even delegitimize liberal democracy itself. His thesis is probably most succinctly captured in this passage from Chapter 7:

We live in an era of unprecedented access to information, and many political parties campaign on platforms of transparency and openness. And yet, at the same time, we are gradually shifting the policing of cyberspace to a dark world largely free from public accountability and independent oversight. In entrusting more and more information to third parties, we are signing away legal protections that should be guaranteed by those who have our data. Perversely, in liberal democratic countries we are lowering the standards around basic rights to privacy just as the center of cyberspace gravity is shifting to less democratic parts of the world. (p. 130-1)

What Deibert is grappling with in this book is the same fundamental problem that has long plagued the Internet: How do you preserve the benefits associated with the most open and interconnected “network of networks” the world has ever known while also remedying the various vulnerabilities and pathologies created by that same openness and interconnectedness?  Deibert acknowledges this problem, noting:

Ever since the Internet emerged from the world of academia into the world of the rest of us, its growth trajectory has been shadowed by a grey economy that thrives on opportunities for enrichment made possible by an open, globally connected infrastructure. (p. 141)

The Paradox of the Net’s Open, Interconnected Nature

Again, paradoxically, this inherent instability and vulnerability is due precisely to the Net’s open and globally interconnected nature. And many governments are looking to exploit that fact. “These unfortunate by-products of an open, dynamic network are exacerbated by increasing assertions of state power,” Deibert notes. (p. 233)

More generally, this uncomfortable fact—that the Net’s open, interconnected nature leads to both enormous benefits as well as huge vulnerabilities—isn’t just true for criminal online activity or the cyber-espionage activities that various nation-states are pursuing today. It is equally true for everything online today. There is a sort of yin and yang to the Net that is simply undeniable and completely unavoidable. For one issue after another, we find that the Net’s greatest blessing—its open, interconnected nature—is also its greatest curse.

For example, as I noted here recently in my review of Abraham H. Foxman and Christopher Wolf’s new book, Viral Hate: Containing Its Spread on the Internet, the open and interconnected Internet gives us “the most widely accessible, unrestricted communications platform the world has ever known” but also means we have to tolerate a great many imbeciles “who use it to spew insulting, vile, and hateful comments.” The same is true for other types of online speech and content: you have access to an abundance of informational riches, but there’s also no avoiding all the garbage out there, too.

Similarly, as I noted in my essay, “Privacy as an Information Control Regime: The Challenges Ahead,” the open and interconnected Internet has given us historically unparalleled platforms for social interaction and commerce. But that same openness and interconnectedness has left us with a world of hyper-exposure and a variety of privacy and surveillance threats—not just from governments and large corporations, but also from each other.

And then there’s the never-ending story of digital copyright. On one hand, the open and globally interconnected network of networks has provided us with an amazing platform for sharing knowledge, art, and expression. On the other hand, as I noted in this essay on “The Twilight of Copyright,” creators of expressive works have less security than ever before in terms of how they can control and monetize their artistic and scientific creations.

I could go on and on—as I did in my essays on “Copyright, Privacy, Property Rights & Information Control: Common Themes, Common Challenges” and “When It Comes to Information Control, Everybody Has a Pet Issue & Everyone Will Be Disappointed”—but the moral of the story is pretty clear: The Internet giveth and the Internet taketh away. Openness and interconnectedness offer us enormous benefits but also force us to confront major risks as the price of admission to this wonderful network.

Will the Whole System Collapse?

The uncomfortable question that Deibert’s book tees up for discussion is: When will this balance get completely out of whack in terms of online security? Or has it already? In some portions of the text, he hints that it may already be the case. Consider this passage in Chapter 11, in which Deibert discusses whether the Chicken Little-ism of digital security worry-warts like Eugene Kaspersky and Richard Clarke is warranted:

Eugene Kaspersky, Richard Clarke, and others may sound like broken records or self-serving fear mongers, but there is no denying the evolving cyberspace ecosystem around us: we are building a digital edifice for the entire planet, and it sits above us like a house of cards. We are wrapping ourselves in expanding layers of digital instructions, protocols, and authentication mechanisms, some of them open, scrutinized, and regulated, but many closed, amorphous, and poised for abuse, buried in the black arts of espionage, intelligence gathering, and cyber and military affairs. Is it only a matter of time before the whole system collapses? (p. 186)

That sounds horrific, but is the entire system really about to collapse? And, if so, what are we going to do about it?

This raises a small problem with Deibert’s book. He does such a nice job itemizing and describing these security vulnerabilities that by the time the reader wades through 230 pages and nears the end of the book, they are left in a highly demoralized state, searching for some hope and a concrete set of practical solutions. Unfortunately, they won’t find an abundance of either in Deibert’s brief closing chapter, “Toward Distributed Security and Stewardship in Cyberspace.”

Don’t get me wrong; I agree with the general thrust of Deibert’s framework, which I describe below. The problem is that it is highly aspirational in nature and lacks specifics. Perhaps that is simply because there are no easy answers here. Digital security is damn hard and, as with most other online pathologies out there, no silver-bullet solutions exist.

Deibert notes that some government officials will seek to exploit those vulnerabilities—many of which they created themselves—to expand their authority over the Internet. “Faced with mounting problems and pressures to do something, too many policy-makers are tempted by extreme solutions,” he notes. (p. 234) He worries about “a movement towards clamp down” that would be “antithetical to the principles of liberal democratic government” by undermining checks and balances and accountability. (p. 235) In turn, this will undermine the “mixed common-pool resource” that is the current Internet.

Deibert’s alternative cyber security strategy to counter the push to “clamp down” is based on three interrelated notions or components:

  1. Principles of restraint or “mutual restraint”: “Securing cyberspace requires a reinforcement, rather than a relaxation, of restraint on power, including checks and balances on governments, law enforcement, intelligence agencies, and on the private sector,” he argues. (p. 239)
  2. “Distributed security”: “The Internet functions precisely because of the absence of centralized control, because of thousands of loosely coordinated monitoring mechanisms,” Deibert notes. “While these decentralized mechanisms are not perfect and can occasionally fail, they form the basis of a coherent distributed security strategy. Bottom-up, ‘grassroots’ solutions to the Internet’s security problems are consistent with principles of openness, avoid heavy-handedness, and provide checks and balances against the concentrations of power,” he observes. (p. 240)
  3. “Stewardship” which Deibert defines as “an ethic of responsible behavior in regard to shared resources” and which, he argues, “would moderate the dangerously escalating exercise of state power in cyberspace by defining limits and setting thresholds of accountability and mutual restraint.” (p. 243)

Again, as an aspirational vision statement this all sounds fairly sensible, but the details are lacking. I think Deibert would have been wise to spend a bit more time developing this alternative “bottom-up” vision of how online security should work and bolstering it with case studies.

Digital Security without Top-Down Controls

Luckily, as my Mercatus Center colleague Eli Dourado noted in an important June 2012 white paper, distributed security and stewardship strategies are already working reasonably well today. Dourado’s paper, “Internet Security Without Law: How Service Providers Create Order Online,” documented the many informal institutions that enforce network security norms on the Internet and showed how cooperation among a remarkably varied set of actors improves online security without extensive regulation or punishing legal liability. “These informal institutions carry out the functions of a formal legal system—they establish and enforce rules for the prevention, punishment, and redress of cybersecurity-related harms,” Dourado noted.

For example, a diverse array of computer security incident response teams (CSIRTs) operate around the globe, sharing their research and coordinating their responses to viruses and other online attacks. Individual Internet service providers (ISPs), domain name registrars, and hosting companies work with these CSIRTs and other individuals and organizations to address security vulnerabilities. A growing market for private security consultants and software providers also competes to offer increasingly sophisticated suites of security products for businesses, households, and governments.

A great deal of security knowledge is also “crowd-sourced” today via online discussion forums and security blogs that feature contributions from experts and average users alike. University-based computer science and cyberlaw centers (like Citizen Lab) and experts have also helped by creating projects like “Stop Badware,” which originated at Harvard University but then grew into a broader non-profit organization with diverse financial support.

Dourado goes on in his paper to show how these informal, bottom-up efforts to coordinate security responses offer several advantages over top-down government solutions, such as administrative regulation or punishing liability regimes.

Dourado’s description of the ideal approach to online security is entirely consistent with Deibert’s vision in Black Code. Indeed, Deibert observes: “It is important to remind ourselves that in spite of the threats, cyberspace runs well and largely without persistent disruption. On a technical level, this efficiency is founded on open and distributed networks of local engineers who share information as peers.” (p. 240) That is exactly right, but I wish Deibert had spent more time discussing how this system works in practice today and how it can be tweaked and improved to head off the heavy-handed and very costly top-down solutions that we both dread.

Toward Resiliency

But there’s one other thing I wish Deibert had explored in the book: resiliency, or how we have adapted to various cyber-vulnerabilities over time.

For example, in another recent Mercatus Center study entitled “Beyond Cyber Doom: Cyber Attack Scenarios and the Evidence of History,” Sean Lawson, an assistant professor in the Department of Communication at the University of Utah, has stressed the importance of resiliency as it pertains to cybersecurity and concerns about “cyberwar.” “Research by historians of technology, military historians, and disaster sociologists has shown consistently that modern technological and social systems are more resilient than military and disaster planners often assume,” he writes. “Just as more resilient technological systems can better respond in the event of failure, so too are strong social systems better able to respond in the event of disaster of any type.”

More generally, as I noted in my recent law review article on “technopanics” and “threat inflation” in information technology policy debates:

while it is certainly true that “more could be done” to secure networks and critical systems, panic is unwarranted because much is already being done to harden systems and educate the public about risks. Various digital attacks will continue, but consumers, companies, and other organizations are learning to cope and become more resilient in the face of those threats.

What Professor Lawson and I are getting at in our respective articles is that the ability of organizations, institutions, and individuals to bounce back from adversity is a frequently unheralded feature of various systems, and one that deserves more serious study. (See Andrew Zolli and Ann Marie Healy’s nice book, Resilience: Why Things Bounce Back, for more on this general topic.) In the context of online security, what is most remarkable to me is not that the Internet suffers from vulnerabilities due to its open and interconnected nature; it’s that we don’t suffer far more damage as a result.

This gets us back to that very profound question that Deibert poses in Black Code: “Is it only a matter of time before the whole system collapses?” The better question, I think, is: Why hasn’t the system already collapsed? Perhaps the answer is that things haven’t gotten bad enough yet. But I believe the more realistic answer is that individuals and institutions often learn how to cope and become resilient in the face of adversity. This is partially the case online because of the stewardship and the distributed, decentralized security we already see at work today, which make digital life tolerable.

But it has to be something more than that. After all, many of the security problems that Deibert describes in his book are quite serious and already affect millions of us today. How, then, are we getting by right now? Again, I think the answer has to be that adaptation and resiliency are at work on many different levels of online life.

Consider, for example, how we have learned to deal with spam, viruses, online porn, various online advertising and privacy concerns, and so on. Our adaptation to these threats and annoyances has not been perfectly smooth, of course. No doubt, some people would still like “something to be done” about these things. But isn’t it remarkable how we have, nonetheless, carried on with online commerce and interactive social life even as these problems have persisted?

Conclusion

Going forward, therefore, perhaps there are some reasons for hope. Perhaps the various generic strategies that Deibert outlines in his book, coupled with the remarkable ability of humans to roll with the punches and adapt, will help us come out of this just fine (or at least reasonably well).

Of course, it could also be the case that these security concerns just multiply and that the Internet then morphs into something quite different from the interconnected “network of networks” we know today. As I noted in my 2009 essay on “Internet Security Concerns, Online Anonymity, and Splinternets,” we might be moving toward a world of more separate, disconnected digital networks and online “gated communities.” This could take place spontaneously over time and be driven by corporations seeking to satisfy the demand of some consumers for safer and more secure online experiences. As I noted in my review of Jonathan Zittrain’s book, The Future of the Internet, I am actually fine with some of that. I think we can live in a hybrid world of “walled gardens” alongside the “Wild West” open Internet, so long as this occurs in a spontaneous, organic, bottom-up fashion. [For a more extensive discussion, see my book chapter, “The Case for Internet Optimism, Part 2 – Saving the Net From Its Supporters.”]

If, however, this “splintering” of the Net is done from the top-down through intentional (or even incidental) government action, then it is far more problematic. We already see signs, for example, that Russia is pushing even more strongly in that direction in the wake of the NSA leaks. (See “N.S.A. Leaks Revive Push in Russia to Control Net,” New York Times, July 14.) The Russians have been using amorphous security concerns to push for greater Internet control for some time now. Of course, China has been there for years. So have many Middle Eastern countries. Of course, there’s no guarantee that their respective “splinternets” are, or would be, any more secure than today’s Internet, but it sure would make those networks far more susceptible to state control and surveillance. If that’s our future, then it certainly is a dismal one.

Anyway, read Ron Deibert’s Black Code for an interesting exploration of these and other issues. It’s an excellent contribution to the field of Internet policy studies and a book that I’ll be recommending to others for many years to come.


Additional resources:

Other books you should read alongside “Black Code” (links are for my reviews of each book):

The NSA is screwing us on Internet governance https://techliberation.com/2013/07/15/the-nsa-is-screwing-us-on-internet-governance/ https://techliberation.com/2013/07/15/the-nsa-is-screwing-us-on-internet-governance/#comments Mon, 15 Jul 2013 14:35:13 +0000 http://techliberation.com/?p=45181

The New York Times reports:

The Russians, who, with only minimal success, had for years sought to make these companies provide law enforcement access to data within Russia, reacted angrily. Mr. Gattarov formed an ad hoc committee in response to Mr. Snowden’s leaks.

Ostensibly with the goal of safeguarding Russian citizens’ private lives and letters from spying, the committee revived a long-simmering Russian initiative to transfer control of Internet technical standards and domain name assignments from two nongovernmental groups that control them today to an arm of the United Nations, the International Telecommunications [sic] Union.

It’s not immediately clear to me how moving Internet standards and DNS from IETF and ICANN to the ITU is supposed to stop the NSA from spying on Russians, so the smart read is that this is retaliation pure and simple.

Brazil’s foreign minister, Antonio Patriota, for example, a week ago endorsed the Russian proposal to transfer some control over Internet technical standards to the United Nations telecommunications agency.

While these are not major changes in policy positions, the NSA’s surveillance programs seem to be galvanizing those who want the ITU to take an active role in Internet governance. It’s time for the USA to practice what it preaches on Internet freedom.

Richard Brandt on Jeff Bezos and amazon.com https://techliberation.com/2013/06/25/richard-brandt/ https://techliberation.com/2013/06/25/richard-brandt/#respond Tue, 25 Jun 2013 10:00:04 +0000 http://techliberation.com/?p=45008

Richard Brandt, technology journalist and author, discusses his new book, One Click: Jeff Bezos and the Rise of Amazon.com. Brandt discusses Bezos’ entrepreneurial drive, his business philosophy, and how he’s grown Amazon to become the biggest retailer in the world. This episode also covers the biggest mistake Bezos ever made, how Amazon uses patent laws to its advantage, whether Amazon will soon become a publishing house, Bezos’ idea for privately-funded space exploration and his plan to revolutionize technology with quantum computing.

Download

Related Links

 

 

What to expect at the WTPF https://techliberation.com/2013/05/06/what-to-expect-at-the-wtpf/ https://techliberation.com/2013/05/06/what-to-expect-at-the-wtpf/#respond Mon, 06 May 2013 13:33:38 +0000 http://techliberation.com/?p=44646

Next week, I’ll be in Geneva for the 2013 World Telecommunication/ICT Policy Forum, better known by the acronym WTPF-13. This is the first major ITU conference since the WCIT in December, and the first real test of whether what some are calling the “post-WCIT era” really exists, and if so, what it means. For those just now tuning in, the WCIT was a treaty conference in Dubai in which some ITU member states pushed hard to make elements of the Internet subject to intergovernmental agreement, resulting in the refusal of 55 countries to sign the treaty. I published a retrospective account of my experience at the WCIT at Ars Technica.

The WTPF will be different than the WCIT in several important ways:

  • It’s not a treaty conference. The output of the meeting is instead a report and several opinions. Draft text of these has been negotiated over three preparatory meetings of an “Informal Experts Group” (IEG). The WTPF will finalize the text, which is non-binding, but is likely to be selectively quoted at future treaty conferences in order to pursue the agenda of each member state.
  • Sector members can participate. The ITU is an intergovernmental organization, and member states are its primary constituency. However, the ITU also allows for “sector members,” which are mostly corporations that are involved in international telecommunications. Sector members will have microphones and be able to address the chair during the WTPF, something they could not do during the WCIT. It has not yet been made conclusively clear to me whether sector members will be able to formally vote, if a formal vote is held. (Secretary-General Hamadoun Touré said there would be no voting at the WCIT, but both informal and formal votes were held.)
  • The Internet is explicitly on the table. The Secretariat promised that Internet governance would not be considered at the WCIT, but it ultimately was, which is one reason that the conference failed to produce a treaty that all countries could feel comfortable signing. But the official theme of the WTPF is “international Internet-related public policy matters,” so there is widespread agreement that the Internet is a suitable topic of discussion at the WTPF, even if there is little agreement on conclusions.
  • Anybody can download and read the official WTPF documents. Before and during the WCIT, working drafts and member state contributions were kept secret. Jerry Brito and I started WCITLeaks in order to give the general public access to these documents. For whatever reason—whether exposure of the lack of transparency in the WCIT process embarrassed the ITU Secretariat, or they were planning to make the WTPF more open anyway—all WTPF documents are available for your perusal, several in all six official ITU languages. Either way, I’m happy to applaud the decision to make the documents available.
  • The WTPF is only three days long. The WCIT was almost two weeks. This imposes significant limitations on the amount of deliberation that can occur. There is also a WTPF every 4 years, whereas a WCIT happens only on an as-demanded basis.

Since the conference is going to be short, I expect that most of the debate will focus on the six draft opinions that have been attached to the Secretary-General’s report. The report itself is probably too long to receive substantial revision in only three days. Consequently, the opinions are likely to be where the action is. The draft opinions are:

  1. Promoting Internet Exchange Points (IXPs) as a long term solution to advance connectivity
  2. Fostering an enabling environment for the greater growth and development of broadband connectivity
  3. Supporting Capacity Building for the deployment of IPv6
  4. In Support of IPv6 Adoption and transition from IPv4
  5. Supporting Multi-stakeholderism in Internet Governance
  6. On supporting operationalizing the Enhanced Cooperation Process

Opinions 1 and 2 will be considered in Working Group 1, Opinions 3 and 4 in Working Group 2, and Opinions 5 and 6 in Working Group 3.

The United States has expressed qualified support for the current draft text of all six opinions in its contribution to the WTPF:

The United States is prepared to endorse the consensus achieved by the IEG and adopt the six non-binding opinions as presented in the annex to the Secretary General’s report. We take this approach based on our desire for a successful forum, despite some concerns with respect to the opinions on multi-stakeholderism and enhanced cooperation. But we recognize, as we hope all participants do, that to attempt to renegotiate the text or introduce new topics or opinions during this meeting would cause significant difficulties and upset the consensus already achieved.

Nevertheless, other countries have proposed substantial changes to the draft IEG text. Perhaps the most controversial opinion is number 5 on multi-stakeholderism. Multi-stakeholderism is a tricky element of international Internet politics. Most participants have agreed at one point or another that the “multi-stakeholder” institutions that currently govern the Internet are an important part of the Internet’s success. However, this has led the more authoritarian countries to insist that governments are stakeholders too, and it has led those who support greater ITU involvement in international Internet policy to insist that the ITU is a multi-stakeholder organization.

For example, in a speech two weeks ago in Brussels, Secretary-General Touré said:

This opinion reiterates what I have been saying for some time—that the ITU has been multi-stakeholder from its inception, and that it was the success of the multi-stakeholder approach within ITU that inspired the multi-stakeholder principles agreed at the ITU-led World Summit on the Information Society, WSIS.

Now, Opinion 5 does not say that the ITU is a multi-stakeholder organization (read it yourself), and the ITU is certainly not, and has never been, a multi-stakeholder institution, unless “multi-stakeholder” is defined as simply having multiple stakeholders. Among those who originally advocated multi-stakeholderism, the term connotes a certain bottom-up, voluntary, inclusive, and even informal process, which is incompatible with intergovernmentalism. This…loose talk…by the Secretary-General appears to be intended to position the ITU to take a more active role in Internet governance. Some member states share Dr. Touré’s apparent agenda. For example, Brazil’s proposed replacement for Opinion 5 explicitly says, “ITU is a multistakeholder organization.”

Russia’s proposed edits to Opinion 5 focus much less on the ITU itself and more on the role of government. For instance, it invites member states:

to exercise their rights on Internet Governance to control distribution, appropriation and development of Internet numbering, naming, addressing and identification resources and support the operation and development of the basic information and communication infrastructure, include the Internet, at the national level.

In other words, Russia wants to supplant existing Internet governance structures with national laws.

Aside from Opinion 5, the other major issue I am keeping my eye on is Working Group 2 on IP addresses and the IPv6 transition. Late last week, there was an unexpected shuffling of Working Group chairs. The chairwoman of WG3 was removed, the chairman of WG2 was moved to WG3, and Musab Abdullah from Bahrain was announced as the new chairman of WG2.

Those of us who were at the WCIT remember Mr. Abdullah as a forceful advocate for measures, like calling party identification and government-managed naming and numbering resources, that would have enabled greater government control of telecommunication services. And Bahrain is one of the world’s most repressive regimes with respect to the Internet. Reporters Without Borders named Bahrain one of only five “state enemies of the Internet” in 2013.

So why did this shakeup of Working Group chairs happen, and why is one of the world’s top censors now chairing the Working Group on IP addressing? Could there be a strong push in favor of an expansive role for governments in assigning IP addresses, one that would allow governments to more easily link IP addresses to individuals in order to support censorship? We’ll find out next Wednesday morning when WG2 convenes.

For updates during the WTPF, follow me on Twitter. As always, any views expressed in this post or in future posts and tweets are my own, and should not be attributed to any government or delegation.

WTF? WTPF! The continuing battle over Internet governance principles https://techliberation.com/2013/04/23/wtf-wtpf-the-continuing-battle-over-internet-governance-principles/ https://techliberation.com/2013/04/23/wtf-wtpf-the-continuing-battle-over-internet-governance-principles/#respond Tue, 23 Apr 2013 19:59:01 +0000 http://techliberation.com/?p=44579

Remember all the businesses, internet techies and NGOs who were screaming about an “ITU takeover of the Internet” a year ago? Where are they now? Because this time, we actually need them.

May 14 – 21 is Internet governance week in Geneva. We have declared it so because there will be three events in that week for the global community concerned with global internet governance. From 14-16 May the International Telecommunication Union (ITU) holds its World Telecommunication Policy Forum (WTPF). This year it is devoted to internet policy issues. With the polarizing results of the Dubai World Conference on International Telecommunications (WCIT) still reverberating, the meeting will revisit debates about the role of states in Internet governance. Next, on May 17 and 18, the Graduate Institute of International and Development Studies and the Global Internet Governance Academic Network (GigaNet) will hold an international workshop on The Global Governance of the Internet: Intergovernmentalism, Multi-stakeholderism and Networks. Here, academics and practitioners will engage in what should be a more intellectually substantive debate on modes and principles of global Internet governance.

Last but not least, the UN Internet Governance Forum will hold its semi-annual consultations to prepare the program and agenda for its next meeting in Bali, Indonesia. The IGF consultations are relevant because, to put it bluntly, it is the failure of the IGF to bring governments, the private sector and civil society together in a commonly agreed platform for policy development that is partly responsible for the continued tension between multistakeholder and intergovernmental institutions. Whether the IGF can get its act together and become more relevant is one of the key issues going forward.

Internet Governance Principles

The Dubai WCIT meeting last year grafted an Internet governance principles debate onto negotiations over an old telecommunications treaty that had little to do with the internet. That muddled the debate considerably. This time, we are actually having a debate about Internet governance principles, specifically the role of states and intergovernmental institutions.

In preparation for the WTPF, The ITU’s Secretary-General has released a 38-page report and five “Draft Opinions” on policy. The stated aim of the WTPF report is “to provide a basis for discussion at the Policy Forum…focusing on key issues on which it would be desirable to reach conclusions.” This is what the IGF ought to be doing but was prevented from doing by key stakeholders in the Internet technical and business communities, because they wanted to make sure the IGF could not be used to challenge the status quo.

The ITU SG’s report contains a fairly balanced survey of many internet-related policy controversies. After digesting it, however, it becomes clear that its main purpose is to re-assert and strengthen the role of governments in Internet governance. In particular, it proposes a definition of multi-stakeholderism that reserves to states a ‘sovereign right’ to make ‘public policy for the Internet;’ a definition that relegates the private sector and civil society to secondary, subordinate roles rather than empowering them as equal-status participants in new institutions for Internet governance. In keeping with this philosophy, the discussions at WTPF will be confined to ITU member states and sector members. Ordinary citizens cannot speak, they can only watch.

A flawed debate

What’s troubling about this looming debate is the intellectual weakness of so many of the supposed defenders of internet freedom. The Internet Society, ICANN and the U.S. government have increasingly re-branded Internet freedom as “The Multistakeholder Model” (TMM). So the choice we are given is not between a free Internet and a restricted, censored one, or between centralized, hierarchical internet governance and a more distributed, participatory, open and decentralized governance. No, we are given a choice between the ITU and a status quo that is vaguely defined as TMM. This not only implies that there is a single, well-defined “Multistakeholder Model” (in fact, there is not), but it conflates the results of good governance (freedom, openness, innovation, globalized connectivity, widespread access) with a particular model. It also tends to exempt many of the existing Internet governance institutions from deserved criticism and reform.

The lack of intellectual substance underlying the principles debate was played out with stark clarity in the U.S. two weeks ago, when the U.S. Congress proposed a bill “to Affirm the Policy of the United States Regarding Internet Governance.” The bill originally said

“It is the policy of the United States to promote a global Internet free from government control and to preserve and advance the successful multistakeholder model that governs the Internet.”

For reasons that we outlined in an earlier blog, the “government control” language was deemed too controversial and the bill was amended to read:

“It is the policy of the United States to preserve and advance the successful multistakeholder model that governs the Internet.”

So the United States has officially refused to endorse freedom from government control as a policy underlying its approach to Internet governance. It does not, apparently, have any principled objection to censorship, state surveillance to facilitate political manipulation of the population, over-regulation, over-taxation, economic protectionism and other destructive forms of governmental intervention. All those things are fine, apparently, as long as we manage to “preserve and advance” multistakeholder governance. What an uninspiring stance!

Why should anyone support TMM if it is devoid of any substantive meaning regarding the role of states and freedom from governmental control? TMM inspires support only if it is presented as a better alternative to a form of governance that is authoritarian, repressive, ineffective and unrepresentative of Internet users’ interests. In other words, we should support TMM only insofar as it contains and limits the power of nation-states to interfere unduly with the use and operation of the Internet, and empowers individuals worldwide to govern themselves. TMM is not an end in itself. In fact, once it is stripped of substantive policy norms, dogmatic support for TMM seems indistinguishable from unqualified support for existing Internet institutions.

As we enter into this crucial debate about principles of Internet governance, we need to have a better understanding of why global Internet governance institutions need to be shielded from national governments. Below we provide some simple bullet points as a guide to the ongoing debate over principles regarding the role of states in Internet governance.

  • The political unit – the polity – for Internet governance should be the transnational community of Internet users and suppliers, not a collection of states.

There is a fundamental, lasting conflict between territorial jurisdiction and the global Internet. There is a fundamental difference between a collection of leaders of national polities and a global polity. Though national governments can provide legitimate and rights-respecting modes of ordering society within their jurisdiction, at the transnational level there is anarchy, a space where the problems of governance are best addressed by new institutions with direct participation and more open channels of communication. National governments are not ‘just another stakeholder’ in a multistakeholder system: they represent a competing, alternative institutional framework.

  • A system of Internet governance based on states is inherently biased toward greater restriction and control of the Internet’s capabilities.

States are by nature oriented toward control. More specifically, they are concerned about maintaining their own control qua sovereign entity in a territory. They will, therefore, act to limit forms of choice and access that provide alternatives to their control of communication and information. In the international arena they will bargain with other states to maintain their security and control in relation to other states. They will not be optimal representatives of the interests of Internet users in freedom, access and openness. Ever.

  • The threats to Internet freedom posed by states are more serious than those posed by private actors.

Get over the stuff about ‘Googledom’ and ‘Facebookistan.’ It’s a cute metaphor but there is really no comparison between sovereigns and these businesses. States have the power to tax and expropriate, they have a monopoly on the use of force, they generate armed conflicts that result in war; they fund and deploy weapons. You do not choose to use their services. However much you might think you are locked in to Google, there is still a huge qualitative difference between your ability to use or not use its services and the choice you have with respect to states. This doesn’t mean that the private sector is perfect nor that there is no need for states to ever order or regulate what private actors do, but it helps to keep your priorities straight.

  • Multi-stakeholderism is not a panacea

Multistakeholderism as an ideology originated as a pragmatic means of opening up intergovernmental organizations (IGOs) to broader representation and participation. As a transitional mechanism for infusing IGOs with more information, expertise and voice, it has worked wonderfully. But it is not a well-defined, ultimate solution to the problem of Internet governance. The organically evolved Internet institutions were not originally conceived as “multistakeholder” but as private sector and contractually based governance. Some forms of Internet governance, such as the IETF, are truly bottom up, based on individualized representation, decentralized and largely voluntary in effect. Others, like ICANN, are highly centralized, largely coercive, and deeply enmeshed with states in a hybrid form of global governance. The virtues (and faults) of one should not be visited upon the other.

An Internet ‘free from Government Control’: A worthy principle https://techliberation.com/2013/04/15/an-internet-free-from-government-control-a-worthy-principle/ https://techliberation.com/2013/04/15/an-internet-free-from-government-control-a-worthy-principle/#respond Mon, 15 Apr 2013 13:10:13 +0000 http://techliberation.com/?p=44513

On Wednesday, April 10, a bill “to Affirm the Policy of the United States Regarding Internet Governance” was marked up in the U.S. House of Representatives. The bill is an attempt to put a formal policy statement into statute law. The effective part says simply:

It is the policy of the United States to promote a global Internet free from government control and to preserve and advance the successful multistakeholder model that governs the Internet.

Yet this attempt to formulate a clear principle and make it legally binding policy has become controversial. This has happened because the bill brings to a head the latent contradictions and elisions that characterize U.S. international Internet policy. In the process it has driven a wedge between what was once a unified front by U.S. Democrats and Republicans against incursions into Internet governance by intergovernmental organizations such as the ITU.

The problem, it seems, is that the Democratic side of the aisle can’t bring itself to say that it is against ‘government control’ per se. Indeed, the bill has forced people linked to the Obama administration to come out and openly admit that ‘government control’ of the internet is OK when we exercise it; it’s just those other countries and international organizations that we need to worry about.

The U.S. has been deeply enmeshed in this contradiction ever since the World Summit on the Information Society in 2003–5, when it fended off criticisms of the U.S.-controlled ICANN while claiming to oppose ‘government control.’ In the meantime, various U.S. government agencies, largely unaware of or indifferent to the Internet freedom rhetoric, have cast global shadows of hierarchy over various aspects of the Internet: extraterritorial domain name takedowns, ACTA, restrictions on online gambling, cyber-weapons, and so on.

Until now, the contradiction has remained latent, a sotto voce muttering that the emperor has no clothes. Only a few hyper-critical academics (like us) were willing to articulate the argument, generally irritating everyone in the process. But now it’s out in the open. The double standard is humorously evident in this video showing the testimony of Rep. Eshoo, a Democrat of California, in the markup hearings. Rep. Eshoo says:

“…the expert agencies have expressed concern with the term, quote, ‘government control,’ unquote. One diplomat suggested that the use of his term could actually undermine existing Internet governance institutions such as ICANN because of its, uh, uh, close relationship with, uh, our government. Foreign countries frequently cite the close coordination between ICANN and US Dept of Commerce as an example of US quote ‘control’ over the internet.”

Well, yes, Rep. Eshoo, other countries do look at ICANN as a form of global Internet control exercised by one government. Are they wrong? ICANN gets its policy making authority over the DNS root directly from a contract with the U.S. government, and in exchange for receiving that contract ICANN has to stay in the U.S. and conform to various policies. This is not ‘close coordination;’ it’s control. Not even the slipperiest politician can plausibly deny this.

A similar double standard was raised in the response of Public Knowledge (PK), a U.S. public interest group. PK happily collected grants to join the U.S.-led charge against ‘government control of the Internet’ in the renegotiation of the ITU’s International Telecommunication Regulations. It joined in the anti-government rhetoric about how the Internet had to be left alone. Now it wants to clarify its position a bit:

we fear that the broad language of the proposed bill may intrude on areas of consumer protection, competition policy, law enforcement and cybersecurity long considered appropriate for national policy formulated by governments with input from civil society, business and the technical community.

Like Rep. Eshoo, PK is forced to distinguish between government control at home (the good kind) and government control that involves the rest of the world (the scary kind). Note that PK also tacitly accepts the description of different roles for government and civil society that the authoritarian states put into the WSIS Tunis Agenda: governments formulate policy and the rest of us just provide input.

Remember, at the end of the WCIT negotiations we were being told that an indirect reference to spam (“unsolicited bulk electronic communications”) in the ITRs opened the door to systematic content regulation on a global basis. Now PK is forced to admit that:

Although we opposed the ITU resolution to require countries to limit spam, the United States protects its citizens from spam through the CAN-SPAM Act.

Indeed. And why are domestic spam laws fine and international ones (that would have to be enforced by and consistent with those same domestic laws, and ratified by the same national legislature that passed the domestic laws) a threat to the very basis of free expression? According to PK,

Our opposition to ceding authority to the ITU to decide how to balance consumer protection and free expression is not because we see no role for government in protecting consumers or promoting competition. Rather, we believe those matters are best decided here at home, by a Congress accountable to the people and enforced by a government constrained by the Constitution.

So has PK gone cyber-nationalist? Like the Chinese, the Russians, the Saudis and the Iranians, does it want a balkanized Internet governed by a separate and distinct series of national sovereigns? If so, what, exactly, is wrong with the ITU as a venue for negotiating governance? The ITU is a global governance institution founded on the principles of national sovereignty.

We think it’s high time to call the bluff of American politicians and advocacy groups that play with this double standard. If they cannot bring themselves to embrace a principle of “a global Internet free from government control,” it’s time to ask them what they do stand for.

Defending the legitimate rights of consumers to be protected against fraud or monopolies is not “government control” of the Internet, by any serious definition. By protecting individual rights to privacy, by challenging coercive and collusive monopolies and by prosecuting fraud, governments are maintaining individual freedom, not exerting control. It is worrisome, therefore, that allegedly liberal groups such as PK want to maintain an option for ‘government control’ at the level of broad principle.

PK’s reversion to cybernationalism is both intellectually flawed and politically disturbing. Its attempt to distinguish between national laws and international ones falls apart completely when examined. Laws that overreach and over-regulate occur at both levels; PK simultaneously underestimates the dangers of government control at home (which is odd, given its involvement in issues such as CISPA) and overstates the dangers of international laws (which typically have to be ratified domestically and are subject to reservations).

Whether you are talking about China, Russia or the USA, you can’t have a free Internet and a national Internet. As a virtual space constructed out of a globally interconnected infrastructure, cyberspace realizes its highest potential when it is not artificially bounded by jurisdiction or hierarchically imposed filters. Right now, the biggest threats to internet freedom are from national governments. And while there are indeed aspects of communications that can and should be left to domestic regulation, any regulation that is too scary to be implemented at the international level probably poses many of the same dangers when enacted at the national level. The idea that we only have to worry about ‘government control’ when we are talking about foreign governments is obviously wrong.

The House bill articulates a worthy principle that can be and should be globally applicable to the Internet. Not controlling the Internet does not mean that there is no role for laws or regulations that safeguard individual rights; it means that national governments should recognize the Internet’s transnational nature and refrain from trying to suppress the rights to free expression and free association that have emerged in the context of a decentralized Internet not under the control of any sovereign.

Does CDT believe in Internet freedom? https://techliberation.com/2013/04/11/does-cdt-believe-in-internet-freedom/ https://techliberation.com/2013/04/11/does-cdt-believe-in-internet-freedom/#comments Thu, 11 Apr 2013 14:38:40 +0000 http://techliberation.com/?p=44480

Last year, in advance of the World Conference on International Telecommunication, Congress passed a concurrent resolution stating its sense that US officials should promote and articulate the clear and unequivocal “policy of the United States to promote a global Internet free from government control and preserve and advance the successful multistakeholder model that governs the Internet today.” This language sailed through the House on a bipartisan basis with broad support from basically everyone in US civil society.

Now that WCIT is over, and the World Telecommunication/ICT Policy Forum looms, Congress is considering a law that reads:

It is the policy of the United States to promote a global Internet free from government control and to preserve and advance the successful multistakeholder model that governs the Internet.

And suddenly it’s controversial. Democrats are concerned that language about freedom “from government control” would apply to—gasp—the US government.

As Rep. Walden says,

Last Congress, we “talked the talk” and passed a resolution defending a global Internet free from government control. This Congress we must “walk the walk” and make it official U.S. policy. If this is a principle we truly believe in, there is no downside to stating so plainly in U.S. law.

I could not agree more.

I am especially disappointed by our friends at CDT. They are coming out against the bill, with both blog post and letter barrels blazing, after having supported the exact same language last year. Apparently, in CDT’s world: US government regulation of the Internet good, foreign government regulation of the Internet bad.

This episode shows the prescience of my colleagues Jerry Brito and Adam Thierer. As they wrote last year when Congress was considering the joint resolution:

The most serious threat to Internet freedom is not the hypothetical specter of United Nations control, but the very real creeping cyber-statism at work in the legislatures of the United States and other nations.

CDT gets this exactly backwards. Here’s hoping they change their minds yet again.

What’s Wrong with Intergovernmentalism? https://techliberation.com/2013/04/09/whats-wrong-with-intergovernmentalism/ https://techliberation.com/2013/04/09/whats-wrong-with-intergovernmentalism/#comments Tue, 09 Apr 2013 13:37:44 +0000 http://techliberation.com/?p=44459

People of the same trade seldom meet together, even for merriment and diversion, but the conversation ends in a conspiracy against the public. — Adam Smith, The Wealth of Nations

As we approach the World Telecommunication/ICT Policy Forum, the debate over whether intergovernmental organizations like the International Telecommunication Union should have a role to play in Internet governance continues. One argument in favor of intergovernmentalism, advanced, for instance, by former ITU Counsellor Richard Hill (now operating his own ITU lobbying organization, delightfully named APIG), goes as follows:

  • Everybody already agrees that governments are sovereign within their own territories.
  • Other than a few “separatists,” everyone agrees that national laws apply to use of the Internet within national borders.
  • It may be advantageous to “harmonize” national laws concerning the Internet.
  • Harmonization of national laws happens through intergovernmental organizations, such as the ITU.
  • Therefore, intergovernmental organizations such as the ITU should have a role in Internet governance.

My purpose in this post is to unpack the third premise. Who exactly benefits (and who is harmed) when national governments harmonize their national laws concerning the Internet?

One way to begin to answer this question is to see which governments think they would benefit from a greater intergovernmental role. One rough metric is whether a country signed the International Telecommunication Regulations (Dubai, 2012). In the map below, signatories are shown in black.

If it’s not clear from the map, there is a strong correlation between authoritarianism and support for the ITRs. Ninety-one percent of those countries ranked as Full Democracies in the Democracy Index opposed the ITRs, while 91 percent of those countries listed as “Authoritarian” supported them.
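The cross-tabulation behind those percentages is straightforward to reproduce. Here is a minimal sketch using made-up sample records rather than the actual Democracy Index rankings and ITR signatory list (the real calculation would simply substitute the full country-level data):

```python
# Illustrative sample only: made-up records, not the real country dataset.
# Each entry pairs a Democracy Index category with whether that country
# signed the 2012 ITRs.
sample = (
    [("Full democracy", False)] * 10 + [("Full democracy", True)]
    + [("Authoritarian", True)] * 10 + [("Authoritarian", False)]
)

def pct_opposed(records, category):
    """Share (in percent) of countries in `category` that did not sign."""
    flags = [signed for cat, signed in records if cat == category]
    return 100.0 * sum(1 for signed in flags if not signed) / len(flags)

print(round(pct_opposed(sample, "Full democracy"), 1))  # roughly 91 with this sample
print(round(pct_opposed(sample, "Authoritarian"), 1))
```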

What national laws do these authoritarian regimes believe need harmonization? I am not privy to any government’s internal deliberations, but as The Economist reports, many of these countries are engaged in “monitoring, filtering, censoring and criminalising free speech online.” It seems to me that the most reasonable hypothesis is that countries like Algeria, Saudi Arabia, Bahrain, China, United Arab Emirates, Russian Federation, Iraq, and Sudan would benefit from a “national Internet segment” because it would normalize the idea of such monitoring and censorship.

In other words, authoritarian regimes favor intergovernmental “harmonization” of national Internet laws because it would enable them to get away with more authoritarianism. China already basically operates a “national Internet segment;” traffic into and out of China is filtered by the government. It is going to be a problem for the Chinese government when its subjects become wealthier, more empowered, and ultimately able to point to Internet policy outside of China and politely ask why part of the Chinese Internet is missing. If other countries were to adopt national Internet segments, the Chinese government would be able to avoid this uncomfortable conversation.

The “cooperation” that is likely to result from intergovernmental Internet policymaking is not the solving of communications problems, which is already accomplished quite ably through international technical organizations such as the Internet Engineering Task Force, but a kind of collusion. If we all agree to respect each other’s right to control information within our respective borders, say the authoritarian regimes, we can tame the more revolutionary aspects of the Internet and solidify our grip on power.

In practice, therefore, intergovernmentalism seems to enable national policies that are not only deplorable from a broadly liberal perspective, but illegal under international law. The Universal Declaration of Human Rights, Article 19 reads:

Everyone has the right to freedom of opinion and expression; this right includes freedom to hold opinions without interference and to seek, receive and impart information and ideas through any media and regardless of frontiers.

Intergovernmentalism should be opposed, therefore, not merely by “separatists,” those who believe national governments have no business applying national law to the Internet. It should be opposed by anyone who does not wish to advance the agenda of censorship.

 

How ARIN and U.S. Commerce Department were duped by the ITU https://techliberation.com/2013/03/29/how-arin-and-u-s-commerce-department-were-duped-by-the-itu/ https://techliberation.com/2013/03/29/how-arin-and-u-s-commerce-department-were-duped-by-the-itu/#comments Fri, 29 Mar 2013 12:37:34 +0000 http://techliberation.com/?p=44385

ARIN is the Internet numbers registry for the North American region. It likes to present itself as a paragon of multistakeholder governance and a staunch opponent of the International Telecommunication Union’s encroachments into Internet governance. Surely, if anyone wants to keep the ITU out of Internet addressing and routing policy, it would be ARIN. And conversely, in past years the ITU has sought to carve away some of the authority over IP addressing from ARIN and other RIRs.

But wait, what is this? On March 15 the ITU Secretary-General released a preparatory report for the ITU’s World Telecommunication/ICT Policy Forum, which will take place in Geneva May 14–16. The report contains six Internet-related draft opinions “to provide a basis for discussion…focusing on key issues on which it would be desirable to reach conclusions.” Draft Opinion #3 pertains to Internet addressing. Among other things, the draft resolves:

  • “that needs-based address allocation should continue to underpin IP address allocation, irrespective of whether they are IPv6 or IPv4, and in the case of IPv4, irrespective of whether they are legacy or allocated address space;”
  • “that all IPv4 transactions be reported to the relevant RIRs, including transactions of legacy addresses that are not necessarily subject to the policies of the RIRs regarding transfers, as supported by the policies developed by the RIR communities;”
  • “that policies of inter-RIR transfer across all RIRs should ensure that such transfers are needs based and be common to all RIRs irrespective of the address space concerned.”
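To make the conditions in those resolutions concrete, the needs-based transfer rule they describe can be modeled as a simple check: the transfer must be reported to the relevant RIR, and the requested block must not exceed the recipient's documented need. This is a loose illustration only, not any RIR's actual policy logic, and the function and parameter names are invented for the sketch:

```python
def transfer_approved(documented_need: int, requested: int, reported_to_rir: bool) -> bool:
    """Toy model of a needs-based inter-RIR transfer check (illustrative only):
    the transfer must be reported to the relevant RIR, and the requested
    address block must not exceed the recipient's documented need."""
    return reported_to_rir and requested <= documented_need

# A request within documented need, properly reported, passes the check;
# an oversized or unreported request does not.
print(transfer_approved(documented_need=1024, requested=512, reported_to_rir=True))
print(transfer_approved(documented_need=1024, requested=2048, reported_to_rir=True))
print(transfer_approved(documented_need=1024, requested=512, reported_to_rir=False))
```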

These policy positions thrust the ITU and its intergovernmental machinery directly into the realm of IP addressing policy. But that is quite predictable; the ITU has always wanted to do that. What is unusual about these resolutions is that they bear an uncanny resemblance to the policy positions currently advocated by ARIN and the U.S. Department of Commerce.

In other words, far from challenging the authority of the RIRs, as it used to do, the ITU now seems to be supinely issuing policy positions that reflect the interests of the RIRs. And after checking with sources who were at the meetings where these draft opinions were created, I confirmed that it was indeed ARIN staff, other RIRs and U.S. Commerce Department representatives who pushed for these positions. Indeed, some sources complained that the whole discussion was completely dominated by RIRs and the U.S.; hardly anyone else was participating.

This is a rather significant turn of events. If nothing else, it makes you think twice about the claims coming out of Dubai that the Internet’s organic multistakeholder institutions were locked in a to-the-death struggle with the forces of repression and authoritarianism in the ITU.

Why did this happen?

As we have noted in earlier blogs, ARIN’s staff and board cling to needs-based address allocations because it gives them control, and they want to retain policy authority over legacy address block holders – because it gives them control. Yet its authority over legacy holders is questionable, to say the least. Legacy block holders not only have no contract with ARIN, they received their number blocks before ARIN existed. Many of them would like to be able to sell numbers to any buyer, regardless of ARIN approvals or needs assessments. ARIN’s current leadership just can’t bring itself to accept this.

Apparently, ARIN is so desperate to validate its shaky claim of authority over legacy address space that it will go to any lengths to find support for it – including inserting its policy preferences into an ITU resolution.

What the geniuses at Commerce and ARIN do not seem to understand is that by getting the ITU to be their sock puppet, they are also legitimizing the notion that the ITU and its collection of governments have a legitimate role to play in making and enforcing IP address policy. And yet there is a nice bargain here: ARIN uses the ITU process to validate its position; the ITU validates its process by having it used by ARIN.

It is clear that the ITU no longer cares much what the substantive policy is; it just wants to be recognized as a platform for global internet policy. Indeed, it is ironic that just as the more enlightened sections of the Internet technical community are starting to question or openly reject needs assessment, the ITU is just starting to embrace it. Insert your favorite joke about regulatory dinosaurs here: by the time the ITU starts endorsing the conventional wisdom, it’s probably no longer wisdom.
