Broadband & Neutrality Regulation

I found this article by Ernesto over at TorrentFreak (“Decluttering The Tubes, Solutions to the BitTorrent ‘Problem’?”) to be very interesting and open-minded, but his readers are really taking him to task for it. In the piece, Ernesto outlines the upsides and downsides of six possible ISP responses to the “BitTorrent problem,” which has been in the news a great deal lately. (These models were apparently suggested to Ernesto by Art Reisman, chief technical officer at APConnections.)

1) Ask for voluntary cooperation.

2) Keep connections within the provider’s network.

3) Usage-based quotas.

4) Limit the total connections allowed at one time per user. (Options 3 and 4 are sketched in code after this list.)

5) Build out networks to handle the increased load and pass the cost on to the consumer.

6) Cancel the service of users who abuse their privileges. There have been reports of providers doing this already.
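
A couple of these are easier to picture in code than in prose. Here is a minimal sketch of how an ISP might enforce options 3 and 4 together; the limits and names are entirely hypothetical and nothing here comes from Reisman’s suggestions:

```python
# A minimal illustrative sketch (not from the article) of how an ISP might
# enforce a monthly usage quota (option 3) plus a cap on concurrent
# connections per user (option 4). All limits here are hypothetical.

MONTHLY_QUOTA_GB = 250   # hypothetical usage quota
MAX_CONNECTIONS = 200    # hypothetical concurrent-connection cap

def allow_new_connection(used_gb: float, open_connections: int) -> bool:
    """Admit a new connection only if the user is under both limits."""
    if used_gb >= MONTHLY_QUOTA_GB:
        return False  # quota exhausted: throttle or block until next cycle
    if open_connections >= MAX_CONNECTIONS:
        return False  # too many simultaneous connections (typical of BitTorrent)
    return True

# A heavy BitTorrent user with 300 open connections is refused:
print(allow_new_connection(used_gb=120.0, open_connections=300))  # False
print(allow_new_connection(used_gb=120.0, open_connections=50))   # True
```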

[Again, see full article for explanation of strengths and weaknesses of each.]

I think many of these solutions sound quite constructive and could possibly be used in some combination to alleviate network congestion problems. But the reader response over at TorrentFreak, which obviously skews toward the heavy BitTorrent user, is perhaps all too predictable: just give us more capacity!


Big Business vs. Regulation


Quote of the Day:

Charles Francis Adams, Jr., then director, and soon to become president of the Union Pacific… revealed to Long on March 1 why railroads were soon [in 1884] to bring all their weight behind the commission form of regulation. Indeed, he suggested the whole course of subsequent big business attitudes toward federal regulation: “If you only get an efficient Board of Commissioners, they could work out of it whatever was necessary. No matter what sort of bill you have, everything depends upon the men who, so to speak, are inside of it, and who are to make it work. In the hands of the right men, any bill would produce the desired results.”

Three years later, Congress created the Interstate Commerce Commission, which, just as Adams had hoped, gradually transformed the railroad industry into a government-run cartel, reversing the rapidly falling rates of the pre-regulation period.


Tennessee has a proposal to create a “Tennessee community conscious Internet provider” seal to be awarded by the consumer affairs division. A bill introduced in the Tennessee General Assembly – HB 2530 – would award a seal to ISPs that:

1) retain IP addresses for 2 years;

Art Brodsky’s 4,789-word article about Connect Kentucky and its offspring Connected Nation has been the talk of telecom circles over the past week.

Connected Nation is a non-profit entity that has become one of the biggest players in the currently topical field of broadband data. Using its work in Kentucky as a model for mapping out broadband availability nationwide, the group has become a driving force behind legislation that would provide grants for other states to duplicate these efforts.

Examples of legislation following the Connect Kentucky model include the Senate version of the current farm bill, H.R. 4212, which incorporates Illinois Democratic Sen. Richard Durbin’s “Connect the Nation Act,” S. 1190. Durbin’s bill would authorize $40 million a year, for five years, for state efforts to map out broadband inventory at the census-block level.

The “Broadband Data Improvement Act,” S. 1492, by Senate Commerce Committee Chairman Daniel Inouye, D-Hawaii, takes a similar approach. The goal is, in the identical language of both bills, to “identify and track the availability and adoption of broadband services within each State.”


AP reports that Time Warner Cable will soon begin to experiment with metered pricing, an idea Adam has touted here several times.

Update: Obviously, I’m asleep at the switch. Adam posted about this already. Mike Masnick has his thinking up at Techdirt. I’m going back to bed now.

Well, I’m a bit scared to say this since I will almost certainly incur the wrath of Mike over at TechDirt as well as a host of others who oppose this concept, but I hope there is some truth to the rumor that Time Warner Cable (TWC) is considering a broadband metering experiment down in Texas. (Seriously, go easy on me Mike!)

According to a leaked internal memo posted over on DSLReports:

The introduction of Consumption Based Billing will enable TWC to charge customer based upon usage, impacting only 5% of subscribers who utilize over half of the total network bandwidth. The trial in the Beaumont, TX division will apply to new HSD customer only, will provide a destination for customer to track usage for each month and will enable customers to upgrade from one tier to the next to avoid payment of overage charges. Existing and new subscribers will have tracking capability, however only new subscribers will be charged incrementally for bandwidth usage above the cap. Following the trial, a determination will be made as to whether or not existing subscribers should be charged. Only residential subscribers will be impacted. Trial in Beaumont, TX will begin by Q1.

I don’t want to rehash the whole debate about the relative merits of metered bandwidth–for that, see this, this and this (+ all the comments)–rather, I just think it will be interesting to see the results of a small-market experiment. Will consumers revolt against the idea since it runs counter to the “all-you-can-eat” buffet-style pricing we’ve grown accustomed to? Or will they embrace metering as a potential money-saving business model that helps lower monthly bills for light Net users?
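
For the curious, here is a rough sketch of the arithmetic behind the scheme the memo describes: tiered caps plus per-GB overage charges, with the option to upgrade a tier instead of paying overage. The tier prices, caps, and overage rate below are invented for illustration, since the memo names no actual numbers:

```python
# A hedged sketch of consumption-based billing as the TWC memo describes it.
# Every number below is hypothetical; the memo gives no rates.

TIERS = [
    # (monthly price in $, included GB) -- hypothetical values
    (29.95, 5),
    (39.95, 20),
    (49.95, 40),
]
OVERAGE_PER_GB = 1.50  # hypothetical per-GB overage charge

def monthly_bill(tier_index: int, used_gb: float) -> float:
    """Base tier price plus overage for usage beyond the tier's cap."""
    price, cap_gb = TIERS[tier_index]
    overage_gb = max(0.0, used_gb - cap_gb)
    return round(price + overage_gb * OVERAGE_PER_GB, 2)

# A light user on the cheapest tier pays only the base price...
print(monthly_bill(0, 3.2))   # 29.95
# ...while a heavy user either pays overage or upgrades to avoid it:
print(monthly_bill(0, 25.0))  # 59.95 -- upgrading to the middle tier (47.45) is cheaper
```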


For someone who’s portrayed as an economic reformer who understands, for example, why a 35-hour workweek is a disastrous idea, French President Nicolas Sarkozy’s newly announced plan to tax Internet connections to subsidize television is quite shocking. From the IHT:

But France, like other countries around the world, is struggling to find ways to keep cultural industries, like video and music, afloat at a time when their traditional audiences are waning. Sarkozy, proposing “a real cultural revolution” and stressing twice that his proposal was “unprecedented,” said: “I want us to profoundly review the requirements of public television and to consider a complete elimination of advertising on public channels.” Instead, he said, those channels “could be financed by a tax on advertising revenues of private broadcasters and an infinitesimal tax on the revenues of new means of communication like Internet access or mobile telephony.”

Do I really have to spell out how this not only props up an antiquated technology that people seem not to want, but simultaneously stifles innovation of the technology that people do want? You know, maybe we should tax digital cameras to subsidize Kodak’s film technology.

ISPs Aren’t “Editors”


I also disagreed with this part of Yoo’s argument:

The Internet has historically been regarded as a “pull” technology in which end users specified the exact content that they wished to see. The explosion of content on the World Wide Web has increasingly given the Internet the characteristics of a “push” technology in which end users rely on intermediaries to aggregate content into regular e-mail bulletins. Even search engine technologies have begun to exhibit forms of editorial discretion as they begin to compete on the quality of their search methodologies. Mandating content nondiscrimination would represent an ill-advised interference with the exercise of editorial discretion that is playing an increasingly important role on the Internet. Editors perform numerous functions, including guaranteeing quality and ensuring that customers receive an appropriate mix of material. For example, consider the situation that would result if a publication such as Sports Illustrated could not exercise editorial control over its pages. One particular issue of the magazine might consist solely of articles on one sport without any coverage of other sports, and there would be no way to guarantee the quality of the writing… The same principles apply to the Internet as it moves away from person-to-person communications to media content. This shift argues in favor of allowing telecommunications networks to exercise editorial control. Indeed, anyone confronting the avalanche of content available on the Internet can attest to the benefits provided by editorial filters. This transition also weakens the case for network neutrality.

I think this misfires on several levels. The first is that he’s mischaracterizing what advocates of network neutrality regulations are trying to accomplish. I don’t know of any prominent advocates of regulation who think the regulations should apply to Google’s search engine, much less Sports Illustrated’s home page. Of course editorial discretion is important in a world of increasing information.


One of the things I disagreed with in Yoo’s paper is that he puts a lot of stock in the notion that Akamai is a violation of network neutrality. Akamai is a distributed caching network that speeds the delivery of popular content by keeping copies of it at various points around the ‘net so that there’s likely to be a cache near any given end user. Yoo says that the existence of Akamai “attests to the extent to which the Internet is already far from ‘neutral.'” I think this is either an uncharitable interpretation of the pro-regulation position or a misunderstanding of how Akamai works.

Network neutrality is about the routing of packets. A network is neutral if it faithfully transmits information from one end of the network to the other and doesn’t discriminate among packets based on their contents. Neutrality is, in other words, about the behavior of the routers that move packets around the network. It has nothing to do with the behavior of servers at the edges of the network because they don’t route anyone’s packets.
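To make that distinction concrete, here is a toy sketch (nothing like how real routers are built) contrasting a neutral forwarding decision, which looks only at the destination header, with a discriminatory one, which also peeks at what the packet carries. The Packet type and route table are invented for illustration:

```python
# Toy contrast between neutral and non-neutral forwarding decisions.
# Real routers are nothing like this; the point is only which fields
# each decision consults.

from dataclasses import dataclass

@dataclass
class Packet:
    dst: str   # destination address (header information)
    app: str   # application type, e.g. "web", "bittorrent" (packet contents)

def neutral_forward(packet: Packet, routes: dict) -> str:
    # The decision depends only on the header, never on the contents.
    return routes[packet.dst]

def discriminatory_forward(packet: Packet, routes: dict) -> str:
    # Deep packet inspection: BitTorrent traffic gets shunted to a slow path.
    if packet.app == "bittorrent":
        return "slow-lane"
    return routes[packet.dst]

routes = {"72.32.122.135": "eth0"}
pkt = Packet("72.32.122.135", "bittorrent")
print(neutral_forward(pkt, routes))         # eth0
print(discriminatory_forward(pkt, routes))  # slow-lane
```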

Now, Yoo thinks content delivery networks like Akamai violate network neutrality:

When a last-mile network receives a query for content stored on a content delivery network, instead of blindly directing that request to the designated URL, the content delivery network may redirect the request to a particular cache that is more closely located or less congested. In the process, it can minimize delay and congestion costs by taking into account the topological proximity of each server, the load on each server, and the relative congestion of different portions of the network. In this manner, content delivery networks can dynamically manage network traffic in a way that can minimize transmission costs, congestion costs, and latency… The problem is that content delivery networks violate network neutrality. Not only does URL redirection violate the end-to-end argument by introducing intelligence into the core of the network; the fact that content delivery networks are commercial entities means that their benefits are available only to those entities willing to pay for their services.

I think Yoo is misreading how Akamai works because he takes the word “network” too literally. Content delivery networks are not “networks” in the strict sense of physical infrastructure for moving data around. The Akamai “network” is just a bunch of servers sprinkled around the Internet. They use vanilla Internet connections to communicate with each other and the rest of the Internet. Internet routers route Akamai packets exactly the same way they route any other packets.

The “intelligence at the core of the network” Yoo discusses doesn’t actually exist in routers (which would violate network neutrality), but in Akamai’s magical DNS servers. DNS is the protocol that translates a domain name like techliberation.com to an IP address like 72.32.122.135. When you query an Akamai DNS server, it calculates which of its thousands of caching servers is likely to provide the best performance for your particular request (based on your location, the load on various servers, congestion, and other factors) and returns its IP address. Now, from the perspective of the routers that make up “the core of the network,” DNS is just another application, like the web or email. Nothing a DNS server does can violate network neutrality, just as nothing a web server does can violate network neutrality, because both operate entirely at the application layer.
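Here is a much-simplified sketch of that application-layer selection logic. The server list, metrics, and scoring rule are all invented; Akamai’s real algorithms are proprietary and far more sophisticated. The point is only that the decision happens in a DNS answer, not in any router:

```python
# Simplified sketch of CDN-style server selection at the DNS layer.
# Servers, latencies, loads, and the scoring rule are all hypothetical.

CACHE_SERVERS = [
    # (ip, latency to this client in ms, current load 0..1)
    ("192.0.2.10", 12.0, 0.80),
    ("192.0.2.20", 45.0, 0.10),
    ("192.0.2.30", 30.0, 0.35),
]

def resolve(hostname: str) -> str:
    """Answer a DNS query with the IP of the 'best' cache server.

    This runs entirely at the application layer: the routers that later
    carry traffic to the chosen IP treat it like any other packet stream.
    """
    def score(server):
        ip, latency_ms, load = server
        return latency_ms * (1.0 + load)  # penalize distant and busy servers
    best_ip, _, _ = min(CACHE_SERVERS, key=score)
    return best_ip

print(resolve("images.example.com"))  # -> "192.0.2.10" with the numbers above
```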

So in a strict technical sense, Akamai is entirely consistent with network neutrality. It’s an ordinary Internet application that works just fine on a vanilla Internet connection. Now, it is true that one of the ways Akamai enhances performance is by placing some of its caching servers inside the networks of broadband providers. This improves performance by moving the servers closer to the end user, and it saves broadband providers money by minimizing the amount of traffic that traverses their backbones. This might be a violation of some extremely broad version of network neutrality, and there’s certainly reason to worry that an overzealous future FCC might start trying to regulate the relationship between ISPs and Akamai. But Akamai is not, as Yoo would have it, evidence that the Internet is already non-neutral.

I just finished a second read-through of Chris Yoo’s paper, “Network Neutrality and the Economics of Congestion.” It’s an excellent paper, and in this post I’m going to highlight some of the points I found most compelling. In a follow-up post I’ll offer a few criticisms of parts of the paper I didn’t find persuasive.

One point Yoo makes very well is that large companies’ ability to coerce their media environment is often overrated: people routinely overestimate the ability of media companies to dominate the online discussion. He points out, for example, that fears that the merged AOL Time Warner would become an unstoppable online juggernaut turned out to be overblown. The merged firm turned out to have little ability to shape the browsing habits of AOL customers, and AOL continued to bleed customers.

Similarly, Yoo makes the important point that when evaluating the ability of a broadband provider to coerce a website operator, it is the broadband company’s national market share, not its local market share, that matters:

application and content providers care about the total number of users they can reach. So long as their total potential customer base is sufficiently large, it does not really matter whether they are able to reach users in any particular city. This point is well illustrated by a series of recent decisions regarding the market for cable television programming. As the FCC and the D.C. Circuit recognized, a television programmer’s viability does not depend on its ability to reach viewers in any particular localities, but rather on the total number of viewers it is able to reach nationwide. So long as a cable network can reach a sufficient number of viewers to ensure viability, the fact that a particular network owner may refuse carriage in any particular locality is of no consequence. The FCC has similarly rejected the notion that the local market power enjoyed by early cellular telephone providers posed any threat to the cellular telephone equipment market, since any one cellular provider represented a tiny fraction of the national equipment market. Simply put, it is national reach, not local reach, that matters. This in turn implies that the relevant geographic market is a national one, not a local one. What matters is not the percentage of broadband subscribers that any particular provider controls in any geographic area, but rather the percentage of a nationwide pool of subscribers that that provider controls. Once the relevant market is properly defined in this manner, it becomes clear that the broadband market is too unconcentrated for vertical integration to pose a threat to competition. The standard measure of market concentration is the Herfindahl-Hirschman Index (HHI), which is calculated by summing the squares of the market shares of each individual firm. The guidelines employed by the Justice Department and the Federal Trade Commission establish 1800 as the HHI threshold for determining when vertical integration would be a cause for anticompetitive concern. The FCC has applied an HHI threshold of 2600 in its recent review of mergers in the wireless industry. The concentration levels for the broadband industry as of September 2005 yield an HHI of only 1110, well below the thresholds identified above. The imminent arrival of 3G, WiFi, WiMax, BPL, and other new broadband technologies promises to deconcentrate this market still further in the near future.
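
As a quick sanity check on the arithmetic: the HHI is just the sum of squared market shares expressed in percent. The shares below are hypothetical (the excerpt doesn’t enumerate the actual September 2005 figures), but they show how a market with two 20-percent leaders lands in the same low range Yoo cites:

```python
# Illustrative HHI computation. The share list is invented; it is not the
# actual September 2005 broadband data, which the excerpt does not provide.

def hhi(market_shares_percent):
    """Herfindahl-Hirschman Index: sum of squared market shares (in percent)."""
    return sum(s ** 2 for s in market_shares_percent)

# Hypothetical national broadband shares summing to 100%:
shares = [20, 20, 9, 8, 8, 7, 7, 6, 5, 5, 3, 2]
print(hhi(shares))  # 1206 -- in the same low range as the 1110 Yoo cites,
                    # and well under the 1800 DOJ/FTC threshold
```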

To put this in concrete terms, if Verizon wants to twist Google’s arm into paying fees for the privilege of reaching Verizon’s customers, Verizon’s relatively limited national market share (about 9 percent) doesn’t give it a whole lot of leverage. It is, of course, important to Google to be able to reach Verizon’s customers, but Google has enough opportunities in the other 91 percent of the broadband market that it wouldn’t be catastrophic if Verizon blocked its customers from accessing Google sites. Conversely, Google would have a strong incentive not to accede to Verizon’s demands, because if it did so it would immediately face demands for similar payoffs from the other 91 percent of the broadband market. Which means that Verizon, knowing that blocking Google would be a PR disaster for itself and that Google would be unlikely to capitulate quickly, isn’t likely to try such a stunt.

The analysis would be different if we had a single firm that controlled a significant chunk of the broadband market—say 50 percent. But there aren’t any firms like that. The largest appear to be Comcast and AT&T, with about 20 percent each. That’s a small enough market share that they’re unlikely to have too much leverage over web providers, which makes me think it unlikely that broadband providers would have the ability to unilaterally impose a “server pays” regime on them.