Hal Singer has discovered that total wireline broadband investment declined 12% in the first half of 2015 compared to the first half of 2014, a net decrease of $3.3 billion across the six largest ISPs. As for what caused the decline, the Federal Communications Commission’s Open Internet Order “is the best explanation for the capex meltdown,” Singer writes.

Despite numerous warnings from economists and other experts, the FCC confidently predicted in paragraph 40 of the Open Internet Order that “recent events have demonstrated that our rules will not disrupt capital markets or investment.”

Chairman Wheeler acknowledged that diminished investment in the network is unacceptable when the commission adopted the Open Internet Order by a partisan 3-2 vote.  His statement said:

Our challenge is to achieve two equally important goals: ensure incentives for private investment in broadband infrastructure so the U.S. has world-leading networks and ensure that those networks are fast, fair, and open for all Americans. (emphasis added.)

The Open Internet Order achieves the first goal, he claimed, by “providing certainty for broadband providers and the online marketplace.” (emphasis added.)

Yet by asserting jurisdiction over interconnection for the first time and by adding a vague new catchall “general conduct” rule, the Order is a recipe for uncertainty.  When asked at a February press conference to provide some examples of how the general conduct rule might be used to stop “new and novel threats” to the Internet, Wheeler admitted “we don’t really know…we don’t know where things go next…”  This is not certainty.

As Singer points out, the FCC has speculated that the Open Internet rules would generate only $100 million in annual benefits for content providers, compared with at least $3.3 billion in reduced network investment since last year. While the rules obviously won’t survive a cost-benefit analysis, I’m not sure they will survive some preliminary questions and even get to the cost-benefit analysis stage.

My friend Tim Lee has an article at Vox that argues that interconnection is the new frontier on which the battle for the future of the Internet is being waged. I think the article doesn’t really consider how interconnection has worked in the last few years, and consequently, it makes a big deal out of something that is pretty harmless.

How the Internet used to work

The Internet is a network of networks. Your ISP is a network. It connects to other ISPs and exchanges traffic with them. Because these connections are roughly equally valuable to both parties, the exchange often happens through “settlement-free peering,” in which networks trade traffic on an unpriced basis.

Not every ISP connects directly to every other ISP. For example, a local ISP in California probably doesn’t connect directly to a local ISP in New York. If you’re an ISP that wants to be sure your customers can reach every other network on the Internet, you have to purchase “transit” service from a bigger or more specialized ISP. Transit allows ISPs to transmit data along what used to be called “the backbone” of the Internet. Transit providers that exchange roughly equally valued traffic with other networks themselves have settlement-free peering arrangements with those networks.

How the Internet works now

A few things have changed in the last several years. One major change is that most major ISPs have very large, geographically dispersed networks. For example, Comcast serves customers in 40 states, and other networks can peer with them in 18 different locations across the US. These 18 locations are connected to each other through very fast cables that Comcast owns. In other words, Comcast is not just a residential ISP anymore. They are part of what used to be called “the backbone,” although it no longer makes sense to call it that since there are so many big pipes that cross the country and so much traffic is transmitted directly through ISP interconnection.

Another thing that has changed is that content providers are increasingly delivering a lot of a) traffic-intensive and b) time-sensitive content across the Internet. This has created the incentive to use what are known as content-delivery networks (CDNs). CDNs are specialized ISPs that locate servers right on the edge of all terminating ISPs’ networks. There are a lot of CDNs—here is one list.

By locating on the edge of each consumer ISP, CDNs are able to deliver content to end users with very low latency and at very fast speeds. For this service, they charge money to their customers. However, they also have to pay consumer ISPs for access to their networks, because the traffic flow is all going in one direction and otherwise CDNs would be making money by using up resources on the consumer ISP’s network.

CDNs’ payments to consumer ISPs are also a matter of equity between the ISP’s customers. Let’s suppose that Vox hires Amazon CloudFront to serve traffic to Comcast customers (they do). If the 50 percent of Comcast customers who wanted to read Vox suddenly started using up so many network resources that Comcast and CloudFront needed to upgrade their connection, who should pay for the upgrade? The naïve answer is to say that Comcast should, because that is what customers are paying them for. But the efficient answer is that the 50 percent who want to access Vox should pay for it, and the 50 percent who don’t want to access it shouldn’t. By Comcast charging CloudFront to access the Comcast network, and CloudFront passing along those costs to Vox, and Vox passing along those costs to customers in the form of advertising, the resource costs of using the network are being paid by those who are using them and not by those who aren’t.

What happened with the Netflix/Comcast dust-up?

Netflix used multiple CDNs to serve its content to subscribers. For example, it used a CDN provided by Cogent to serve content to Comcast customers. Cogent ran out of capacity and refused to upgrade its link to Comcast. As a result, some of Comcast’s customers experienced a decline in quality of Netflix streaming. However, Comcast customers who accessed Netflix with an Apple TV, which is served by CDNs from Level 3 and Limelight, never had any problems. Cogent has had peering disputes in the past with many other networks.

To solve the congestion problem, Netflix and Comcast negotiated a direct interconnection. Instead of Netflix paying Cogent and Cogent paying Comcast, Netflix is now paying Comcast directly. They signed a multi-year deal that is reported to reduce Netflix’s costs relative to what they would have paid through Cogent. Essentially, Netflix is vertically integrating into the CDN business. This makes sense. High-quality CDN service is essential to Netflix’s business; they can’t afford to experience the kind of incident that Cogent caused with Comcast. When a service is strategically important to your business, it’s often a good idea to vertically integrate.

It should be noted that what Comcast and Netflix negotiated was not a “fast lane”—Comcast is prohibited from offering prioritized traffic as a condition of its merger with NBC/Universal.

What about Comcast’s market power?

I think that one of Tim’s hang-ups is that Comcast has a lot of local market power. There are lots of barriers to creating a competing local ISP in Comcast’s territories. Doesn’t this mean that Comcast will abuse its market power and try to gouge CDNs?

Let’s suppose that Comcast is a pure monopolist in a two-sided market. It’s already extracting the maximum amount of rent that it can on the consumer side. Now it turns to the upstream market and tries to extract rent. The problem with this is that it can only extract rents from upstream content producers insofar as it lowers the value of the rent it can collect from consumers. If customers have to pay higher Netflix bills, then they will be less willing to pay Comcast. The fact that the market is two-sided does not significantly increase the amount of monopoly rent that Comcast can collect.

Interconnection fees that are being paid to Comcast (and virtually all other major ISPs) have virtually nothing to do with Comcast’s market power and everything to do with the fact that the Internet has changed, both in structure and content. This is simply how the Internet works. I use CloudFront, the same CDN that Vox uses, to serve even a small site like my Bitcoin Volatility Index. CloudFront negotiates payments to Comcast and other ISPs on my and Vox’s behalf. There is nothing unseemly about Netflix making similar payments to Comcast, whether indirectly through Cogent or directly, nor is there anything about this arrangement that harms “the little guy” (like me!).

For more reading material on the Netflix/Comcast arrangement, I recommend Dan Rayburn’s posts here, here, and here. Interconnection is a very technical subject, and someone with very specialized expertise like Dan is invaluable in understanding this issue.

In her UN General Assembly speech denouncing NSA surveillance, Brazil’s President Dilma Rousseff said:

Information and communications technologies cannot be the new battlefield between States. Time is ripe to create the conditions to prevent cyberspace from being used as a weapon of war, through espionage, sabotage, and attacks against systems and infrastructure of other countries. … For this reason, Brazil will present proposals for the establishment of a civilian multilateral framework for the governance and use of the Internet and to ensure the protection of data that travels through the web.

We share her outrage at mass surveillance. We share her opposition to the militarization of the Internet. We share her concern for privacy.

But when President Rousseff proposes to solve these problems by means of a “multilateral framework for the governance and use of the Internet,” she reveals a fundamental flaw in her thinking. It is a flaw shared by many in civil society.

You cannot control militaries, espionage and arms races by “governing the Internet.” Cyberspace is one of many aspects of military competition. Unless one eliminates or dramatically diminishes political and military competition among sovereign states, states will continue to spy, break into things, and engage in conflict when it suits their interests. Cyber conflict is no exception.

Rousseff is mixing apples and oranges. If you want to control militaries and espionage, then regulate arms, militaries and espionage – not “the Internet.”

This confusion is potentially dangerous. If the NSA outrages feed into a call for global Internet governance, and this governance focuses on critical Internet resources and the production and use of Internet-enabled services by civil society and the private sector, as it inevitably will, we are certain to get lots of governance of the Internet, and very little governance of espionage, militaries, and cyber arms.

In other words, Dilma’s “civilian multilateral framework for the governance and use of the Internet” is only going to regulate us – the civilian users and private sector producers of Internet products and services. It will not control the NSA, the Chinese People’s Liberation Army, the Russian FSB or the British GCHQ.

Realism in international relations theory is based on the view that the international system is anarchic. This does not mean that it is chaotic, but simply that the system is composed of independent states and there is no central authority capable of coercing all of them into following rules. The other key tenet of realism is that the primary goal of states in the international system is their own survival.

It follows that the only way one state can compel another to do anything is through some form of coercion, such as war, a credible threat of war, or economic sanctions. And the only time states agree to cooperate to set and enforce rules is when it is in their self-interest to do so. Thus, when sovereign states come together to agree to regulate things internationally, their priorities will always be to:

  • Preserve or enlarge their own power relative to other states; and
  • Ensure that the regulations are designed to bring under control those aspects of civil society and business that might undermine or threaten their power.

Any other benefits, such as privacy for users or freedom of expression, will be secondary concerns. That’s just the way it is in international relations. Asking states to prevent cyberspace from being used as a weapon of war is like asking foxes to guard henhouses.

That’s one reason why it is so essential that these conferences be fully open to non-state actors, and that they not be organized around national representation.

Let’s think twice about linking the NSA reaction too strongly to Internet governance. There is some linkage, of course. The NSA revelations should remind us to be realist in our approach to Internet governance. This means recognizing that all states will approach Internet regulation with their own survival and power uppermost in their agenda; it also means that any single state cannot be trusted as a neutral steward of the global Internet but will inevitably use its position to benefit itself. These implications of the Snowden revelations need to be recognized. But let us not confuse NSA regulation with Internet regulation.

Over the past year, as the debate over internet radio royalty rates has raged, I have been a lonely voice calling for the repeal of compulsory licensing of digital performance rights altogether. I did so at the Cato event for my book, Copyright Unbalanced, in January at a State of the Net panel, and in my Reason column. The reaction I often received was either outrage from the Pandoras of the world or condescension at my naive optimism. Well, optimism can pay off. Yesterday Rep. Mel Watt, ranking member of the House Judiciary Committee’s Subcommittee on Courts, Intellectual Property and the Internet, introduced the “Free Market Royalty Act,” which among other things gets rid of compulsory licensing.

The problem with the compulsory licensing scheme is twofold: Not only does it rely on federal bureaucrats to set the rates that artists must accept for their music (rather than allowing a free-market negotiation take place between copyright holders and those who want to broadcast their songs), but it also allows Congress to pick winners and losers by assigning different royalty rate standards to different users. As I explained in Reason:

While AM, FM, cable and satellite radio, and Internet radio services like Pandora can all opt for compulsory licenses, they each pay different royalty rates. The rates are set by a panel of government lawyers called the Copyright Royalty Board, and they have the effect of favoring some business models over others. Internet radio services pay over 60 percent of their revenue in royalties, while Sirius XM, the only satellite radio company, pays only 8 percent. AM and FM radio aren’t subject to a digital sound recording right, so they pay nothing.

Watt’s bill would blow all this up, requiring terrestrial broadcasters, Internet radio services, and the rest to give up their price-fixed compulsory licenses and negotiate the rates they pay in a market. This truly levels the playing field, especially vis-à-vis interactive music services like Spotify and Rdio, which have never benefited from compulsory licenses.

Whether you talk to supporters of Rep. Chaffetz’s Internet Radio Fairness Act or Rep. Nadler’s Interim FIRST Act, each will say their bill is the true free market approach and that their rate-setting standard would best approximate a market. To them I say: nothing better approximates a market than the market itself. So if they are truly concerned about ensuring a free-market level playing field, here is the way to do it.

One advantage of compulsory licensing is that it can reduce transaction costs. The Watt bill retains some of this advantage by designating SoundExchange, a nonprofit agency, as the common agent for copyright owners to facilitate negotiations, while allowing labels and artists to retain the right to opt out and negotiate on their own. If this bill passes, I think we’ll see some very interesting experimentation with business models on the part of both artists and radio stations.

Finally, looking at the press coverage of this bill, what has gotten the most attention is that it would, for the first time, require terrestrial AM/FM radio stations to negotiate and pay royalties for the sound recordings they broadcast. It’s not clear to me why broadcasters deserve yet another subsidy, so I shed no tears for them if this bill passes. Broadcasters argue that they provide promotional value for the songs they broadcast, that this benefits copyright holders, and that they should therefore continue to pay nothing. If airplay really does provide substantial promotional value, that will be taken into account in the course of negotiations, and we should expect the ultimate rate to reflect it. Indeed, you can even imagine an outcome in which the free market rate for terrestrial stations remains at zero, or one in which copyright holders pay the stations. That’s the beauty of the market, so let’s unleash it.

As the “real world” continues its inexorable march toward our all-IP future, the FCC remains stuck in the mud fighting the regulatory wars of yesteryear, wielding its traditional weapon of bureaucratic delay to mask its own agenda.

Late last Friday the Technology Transitions Policy Task Force at the Federal Communications Commission (FCC) issued a Public Notice proposing to trial three narrow issues related to the IP transition (the transition of 20th Century telephone systems to the native Internet networks of the 21st Century). Outgoing FCC Chairman Julius Genachowski says these “real-world trials [would] help accelerate the ongoing technology transitions moving us to modern broadband networks.” Though the proposed trials could prove useful, in the real world the Public Notice is more likely to discourage future investment in Internet infrastructure than to accelerate it.

Remember all the businesses, internet techies and NGOs who were screaming about an “ITU takeover of the Internet” a year ago? Where are they now? Because this time, we actually need them.

May 14 – 21 is Internet governance week in Geneva. We have declared it so because there will be three events in that week for the global community concerned with global internet governance. From 14-16 May the International Telecommunication Union (ITU) holds its World Telecommunication Policy Forum (WTPF). This year it is devoted to internet policy issues. With the polarizing results of the Dubai World Conference on International Telecommunications (WCIT) still reverberating, the meeting will revisit debates about the role of states in Internet governance. Next, on May 17 and 18, the Graduate Institute of International and Development Studies and the Global Internet Governance Academic Network (GigaNet) will hold an international workshop on The Global Governance of the Internet: Intergovernmentalism, Multi-stakeholderism and Networks. Here, academics and practitioners will engage in what should be a more intellectually substantive debate on modes and principles of global Internet governance.

Last but not least, the UN Internet Governance Forum will hold its semi-annual consultations to prepare the program and agenda for its next meeting in Bali, Indonesia. The IGF consultations are relevant because, to put it bluntly, it is the failure of the IGF to bring governments, the private sector and civil society together in a commonly agreed platform for policy development that is partly responsible for the continued tension between multistakeholder and intergovernmental institutions. Whether the IGF can get its act together and become more relevant is one of the key issues going forward.


Following up on Eli’s earlier post (“Does CDT believe in Internet freedom?”), I thought I’d just point out that we’ve spent a great deal of time here through the years defending real Internet freedom, which is properly defined as “freedom from state action; not freedom for the State to reorder our affairs to supposedly make certain people or groups better off or to improve some amorphous ‘public interest.'” All too often these days, “Internet freedom,” like the term “freedom” more generally, is defined as a set of positive rights and entitlements, complete with corresponding obligations on government to deliver the goods and tax and regulate comprehensively to accomplish it. Using “freedom” in that way represents a grotesque corruption of language, one that defenders of human liberty must resist with all our energy.

I’ll be writing more about this in upcoming columns, but here’s a short list of past posts on Internet freedom, properly defined:

Shortly after AT&T announced “Project Velocity IP,” its plan to invest an additional $14 billion to provide high-speed Internet access to 99 percent of customer locations in its wireline service area, I blogged about the broad consensus among policymakers, pundits, and industry players in support of the announcement. But, “you can never please all of the people all of the time.” Now that the initial buzz around the announcement has abated, the inevitably unpleased few have gone on the offensive.

The few who cannot be pleased by Internet transformation are “competitive local exchange carriers,” also known as “CLECs.” These companies were created in the mid-1990s to provide both residential and business consumers with an additional choice for telephone service, but after the dot-com bubble burst, CLECs chose to limit their offerings to more lucrative business customers in downtown metro areas. They typically do not offer service to residential consumers or businesses that demand additional options in more suburban and rural areas.

While companies that serve all types of American consumers are investing in the transformation of their outdated telephone systems into the all-Internet protocol (IP) infrastructure of the 21st Century to deliver high-speed Internet services to residential consumers, CLECs claim the IP-transition is a “waste of resources” and a “distraction.” Rather than invest in and deploy new networks offering millions of consumers additional choices for high-speed Internet access, CLECs are investing in the regulatory process in hopes the FCC will save them from the inconvenience and expense of transitioning to all-IP infrastructure. The FCC should not allow the self-interest of CLECs to stand in the way of the IP-transition or the delivery of high-speed Internet services to millions of residential consumers who demand more choice.

[Updated 7/10/14: See new addendum at bottom. Updated 4/28/13: Included links to several things + started list of additional resources at end.]

Each year I am contacted by dozens of people who are looking to break into the field of information technology policy as a think tank analyst, a research fellow at an academic institution, or even as an activist. Some of the people who contact me I already know; most of them I don’t. Some are free-marketeers, but a surprising number of them are independent analysts or even activist-minded Lefties. Some of them are students; others are current professionals looking to change fields (usually because they are stuck in a boring job that doesn’t let them channel their intellectual energies in a positive way). Some are lawyers; others are economists, and a growing number are computer science or engineering grads. In sum, it’s a crazy assortment of inquiries I get from people, unified only by their shared desire to move into this exciting field of public policy.

I always do my best to answer their emails, calls, and requests for meetings. Unfortunately, there’s only so much time in the day and I am sometimes not able to get back to all of them. I always feel bad about that, so this essay is an effort to gather my thoughts and advice and put it all in one place so that I will at least have something to send these folks. Perhaps I’ll try to update it over time.

#1) Understand that Specialization Matters

I don’t want to bury the lede here, so let me start with the most important piece of advice I share with everyone who contacts me: specialization matters. When I got started in the sleepy field of information technology policy back in 1991, it was possible to be a jack-of-all-trades. There were only a few issues that really mattered, and most of them were tied up with traditional communications and media policy. If you knew a little something about telephony, universal service subsidies, spectrum policy, and broadcast regulation, then you could be an analyst in this field. There were only a handful of people in the think tank world back then who even cared about such issues.

In the wake of the election, Matt Hindman, author of The Myth of Digital Democracy, analyzes the effect of the internet on electoral politics. 

According to Hindman, the internet had a large—but indirect—effect on the 2012 elections. Particularly important was microtargeting to identify supporters and get out the vote, says Hindman. Data and measurements—two things that the GOP was once ahead in, but which they have ceded to the Democrats in the past 8 years—played a key role in determining the winner of the presidential election, according to Hindman. 

Hindman also takes a critical look at the blogosphere, comparing it to the traditional media that some argue it is superseding, and he delineates the respective roles played by Facebook and Twitter within the electoral framework.
