Peering and Transit at Ars

September 2, 2008

My favorite thing about Ars Technica (aside from the fact that I get to write for them) is their in-depth features on technical issues. Out today is the best discussion I’ve seen of transit and peering for the lay reader. One key section:

I once heard the following anecdote at a RIPE meeting.

Allegedly, a big American software company was refused peering by one of the incumbent telco networks in the north of Europe. The American firm reacted by finding the most expensive transit route for that telco and then routing its own traffic to Europe over that link. Within a couple of months, the European CFO was asking why the company was paying out so much for transit. Soon afterward, there was a peering arrangement between the two networks…

Tier 1 networks are those networks that don’t pay any other network for transit yet still can reach all networks connected to the internet. There are about seven such networks in the world. Being a Tier 1 is considered very “cool,” but it is an unenviable position. A Tier 1 is constantly faced with customers trying to bypass it, and this is a threat to its business. On top of the threat from customers, a Tier 1 also faces the danger of being de-peered by other Tier 1s. This de-peering happens when one Tier 1 network thinks that the other Tier 1 is not sufficiently important to be considered an equal. The bigger Tier 1 will then try to get a transit deal or paid peering deal with the smaller Tier 1, and if the smaller one accepts, then it is acknowledging that it is not really a Tier 1. But if the smaller Tier 1 calls the bigger Tier 1’s bluff and actually does get de-peered, some of the customers of either network can’t reach each other.

When I first learned about the Internet’s basic peering model, it seemed like there was a real danger of a natural monopoly developing if too many Tier 1 providers merged or colluded. But what this misses is that larger networks face a constant threat of their customers bypassing them and peering directly with one another. As a result, even if there were only one Tier 1 provider, that provider wouldn’t have much monopoly power, because any time it raised its prices, its largest customers would start building out infrastructure to bypass its network.

In effect, BGP, the protocol that governs the interactions of the various networks, creates a highly liquid market for interconnection. Because a network has the technical ability to change its local topology in a matter of hours, it’s always in a reasonably strong bargaining position, even when dealing with a larger network.
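To make that concrete, here is a minimal sketch in Python of the economic logic a network bakes into its BGP policy; the AS numbers and preference values are made up for illustration, not any real network’s configuration. Routes learned from paying customers are preferred over settlement-free peers, which in turn beat paid transit:

```python
# A minimal sketch of how peering economics shows up in BGP policy.
# The AS numbers and local-preference values are illustrative only.

LOCAL_PREF = {"customer": 300, "peer": 200, "provider": 100}

def best_route(routes):
    """Mimic the start of BGP's decision process: highest
    local-preference wins; ties go to the shortest AS path."""
    return max(routes, key=lambda r: (LOCAL_PREF[r["learned_from"]],
                                      -len(r["as_path"])))

# Three ways to reach the same destination prefix:
routes = [
    {"learned_from": "provider", "as_path": [64500, 64501]},  # paid transit
    {"learned_from": "peer",     "as_path": [64502]},         # settlement-free
    {"learned_from": "customer", "as_path": [64503, 64502]},  # paying customer
]

print(best_route(routes))  # the customer route wins despite its longer path
```

Because these preferences are just router configuration, redirecting traffic, as the telco in the anecdote above discovered, is a policy change rather than a construction project.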

Things are trickier in the “last mile” broadband market, but at least if we’re talking about the Internet backbone, this is a fiercely competitive market and seems likely to remain that way for the foreseeable future.

Google is entering the browser wars today (if any such war still exists) with the launch of Chrome, its new web browser. I’m glad to see more competition in browsers because I think—and I hope everyone else agrees with me—that Firefox is currently the only real game in town. I know that Internet Explorer is more popular, but that seems to be only because it ships with every Windows PC and because many enterprise web applications require IE’s non-standard rendering. Firefox is the preferred browser for anyone who works with the web regularly and has bothered to compare browsers.

One implication of this foray by Mountain View into the browser arena is that—should Chrome be at all successful—they will soon be accused of using their supposed search monopoly to squeeze out competition from IE and Firefox. That is assuming that anything in Chrome favors Google’s search, like making it the default search engine for the browser, which I’m sure it will be.

It’s funny to think that Microsoft, the poster child of antitrust suits, could be the one launching such a suit. Just a few short years ago we saw Microsoft scoffing at the very notion of antitrust or monopoly power, arguing that it in no way used its market share to its own advantage. Now we see Redmond lashing out against Google as a monopolist. At a recent conference I had the unpleasant experience of watching a panel on online advertising devolve into a fight between the Microsoft and Google reps over whether Google was a search and advertising monopolist.

Continue reading →

Variations on a theme:

* “The net regards censorship as a failure, and routes around it.” — John Gilmore, Sun Microsystems & EFF co-founder.

* “The net regards hierarchy as a failure, and routes around it.” — Mark Pesce, Writer, consultant, Sydney, Australia

* “The web regards centralization as a failure, and routes around it… by moving to the edge.” — Stowe Boyd, /Message blog

* “The net regards the middleman as a failure, and routes around it.” — Terry Heaton, PoMo Blog

Anybody have any others to add?

John Markoff had an interesting article in the New York Times this weekend entitled “Internet Traffic Begins to Bypass the U.S.” In the piece, Markoff notes that “The era of the American Internet is ending” since “data is increasingly flowing around the United States,” instead of all flowing through our country, as it once did. Markoff focuses on how that “may have intelligence — and conceivably military — consequences.”
Indeed, it may. But what I also found interesting about this fact is the implications it will have for the future of content regulation. As Harvard’s Yochai Benkler told the Times, “This is one of many dimensions on which we’ll have to adjust to a reduction in American ability to dictate terms of core interests of ours.” Content controls are one way that lawmakers enforce what they perceive to be a country’s “core interests.” As less and less Internet traffic flows through the U.S., it could become increasingly difficult for American lawmakers to impose their particular vision or morality on the Internet.

And that’s both good and bad news.

Continue reading →

Well, another four months have passed since I last asked this question, but let me pose it again: Where exactly is the FCC’s Video Competition Report, and why is it taking so long to get it out the door? It wouldn’t have anything to do with a certain Chairman Ahab still trying to get his cable whale, would it? No, of course not. I’m sure there’s a perfectly rational reason that this 13th annual report is now something like 18 months past due. Right.

And keep in mind that the data in the 13th report covers a period ending on June 30, 2006, so whenever the report finally comes out, the data in it will be well over two years old! That won’t exactly reflect the true state of the video programming market considering the significant changes we have seen since that time, especially the continued explosive growth of online video, VOD, and DVRs.

The reason I have been making a big deal out of this issue is that it gets to the question of just how “scientific” and “independent” an agency the FCC really is. We are talking about facts here. Basic data. This is stuff the FCC should be routinely collecting and reporting on a timely basis — indeed, that is what Congress requires the agency to do in this specific case. And yet the agency can’t do it because its Chairman is on this Moby Dick-like crusade against the cable industry. By the time this 13th annual report finally sees the light of day, the 15th annual report might be due! Outrageous. (And you wonder why many of us here are so skeptical about empowering the FCC to regulate the Internet via Net neutrality mandates! If an over-zealous Chairman can politicize an issue like this, just think what might happen once we give the agency the authority to regulate the Net.)

Anyway, down below you will find the paper that Barbara Esbin and I wrote about the issue four months ago. Perhaps we should place a little ticker somewhere here on the site that counts each day that passes as we wait for the Commission to produce this report. We can take bets on when the agency’s data holdout will end.
Continue reading →

Bandwidth Cap Worries

August 30, 2008

Susan Crawford worries about the implications of Comcast’s bandwidth cap:

Comcast sees a future in which people use the internet to send a few emails or look at a few web pages. They don’t want people watching HD content from other sources online, because that doesn’t fit their business model. So rather than increase capacity, they’d rather lower expectations. 250GB/month is about 50-60 HD movies a month, but we’re not necessarily going to be watching movies. Maybe we’ll be doing constant HD video sessions with other freelancers, or interacting with big groups all over the world in real-time. Who knows what we’ll be doing – it’s all in the future.

But rather than build towards a user-powered future, Comcast wants to shape that future — in advance — in its own image. The company is not offering additional bandwidth packages to people who want more. They just want to be able to shut service off at a particular point – a point of bandwidth use that most people aren’t using right now, so that they won’t be unhappy. By the time we all want to be doing everything online, Comcast users (the company hopes) won’t expect anything better.

There are several observations to make here. In the first place, there isn’t an either-or choice between building more capacity and limiting current users. Comcast is doing both: it’s upgrading to DOCSIS 3.0 at the same time it’s experimenting with new usage limits. Obviously, the ideal situation is one in which capacity upgrades are sufficient to accommodate increased demand. But if they’re not, network owners have to do something about it, and a high, transparent bandwidth cap isn’t a terrible approach.

Second, this cap really is quite high. 250 GB/month is roughly 1 Mbps for every waking hour, or 10 Mbps (which is faster than my current broadband connection) for about 2 hours a day. Her estimate of 50-60 HD movies a month sounds high to me, but certainly there’s enough bandwidth there to download more HD movies than the average family watches in a month.
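For anyone who wants to check that arithmetic, here is a quick back-of-the-envelope script; the 30-day month, 16 waking hours a day, and roughly 4.5 GB per HD movie are my assumptions, not Comcast’s figures:

```python
# Back-of-the-envelope check on the 250 GB/month cap. Assumptions:
# a 30-day month, 16 waking hours a day, and ~4.5 GB per HD movie.

cap_gb = 250
cap_megabits = cap_gb * 8 * 1000        # GB -> megabits (decimal units)

waking_seconds = 30 * 16 * 3600
print(cap_megabits / waking_seconds)    # ~1.16 Mbps, every waking hour

two_hours_daily = 30 * 2 * 3600
print(cap_megabits / two_hours_daily)   # ~9.3 Mbps for 2 hours a day

print(cap_gb / 4.5)                     # ~55 HD movies at ~4.5 GB each
```

At a heftier 8 GB per movie, the same cap buys closer to 30 a month, which is one reason her 50-60 estimate strikes me as high.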

Third, there’s absolutely no reason to think that this cap is permanent, or that they won’t give consumers reasonable options to get more bandwidth. Comcast is in business to make money. There’s lots of valuable content on the Internet. Therefore, it’s in Comcast’s interest to sell consumers the bandwidth they need to access the Internet content they want. Now, Comcast might charge more for a really high-speed, high-cap Internet access plan. That’s their right, and I’m at a loss to see why it would be a problem. Infrastructure upgrades cost money. It’s only fair to charge the most to the people who use the infrastructure the most. Provided that users do have the option to access the content they want, I fail to see what the problem is.

Finally, Crawford is upset that usage of Comcast’s digital voice service isn’t counted against the cap. But VoIP uses so little bandwidth that as a practical matter, this will matter very little. More to the point, if Crawford is worried about Comcast dedicating bandwidth to its own proprietary services, I’ve got a much bigger target for her to worry about: cable television. Comcast’s cable service has been sucking up bandwidth that could have otherwise gone to Internet connectivity for decades. Does Crawford think it’s unethical for Comcast to offer traditional cable television service? If not, then how is offering dedicated bandwidth to Comcast’s VoIP offering any different?
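To put a number on just how little bandwidth VoIP uses, here is a rough sanity check; the codec bitrate and the calling time are my assumptions:

```python
# Rough estimate of a month of heavy VoIP calling against a 250 GB cap.
# Assumptions: a G.711 call carries about 64 kbps of voice payload,
# call it ~90 kbps per direction with packet overhead, and the
# household talks two hours a day.

KBPS = 90
HOURS_PER_DAY = 2
DAYS = 30

kilobits = KBPS * HOURS_PER_DAY * 3600 * DAYS
gb_per_month = kilobits / 8 / 1e6   # kilobits -> kilobytes -> gigabytes
print(round(gb_per_month, 1))       # ~2.4 GB, about 1% of the cap
```

Even counting both directions of the call, that is a rounding error next to HD video.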

Read Recently: The Marriage of Elizabeth Barrett Browning and Robert Browning. A remarkable and very non-technological story.

Also: Most of John Dupré’s book Human Nature and the Limits of Science. This turns out to be a critique of two models of human nature, one derived from evolutionary biology and evolutionary psychology, and the other derived from economics. Dupré favors a view of human nature more closely linked to culture, acknowledging the value of diversity. This is a topic well worth writing about; unfortunately, this particular book would have benefited from a vigorous pre-publication critique. Reading it is a lot like having a very frustrating dinner conversation with Dupré, in which interesting arguments are stumbled over, explained only partly, and then abandoned.

Continue reading →

Winners and Losers

August 29, 2008

The Federal Communications Commission picks winners and losers, which is why we ought to get rid of it. During the chairmanship of Reed E. Hundt, the losers were the incumbent phone companies, whom Hundt considered too Republican. Now it is a cable company, which some consider too Democratic.

The FCC issued an order last week concluding that Comcast acted discriminatorily and arbitrarily to squelch the dynamic benefits of an open and accessible Internet, and that its failure to disclose its practices to its customers compounded the harm. Wow. The FCC will require Comcast to end its network management practices and submit a compliance plan, which is code for submitting to bureaucratic micromanagement.

FCC Chairman Kevin Martin recently asked, “Would you be OK with the post office opening your mail, deciding they didn’t want to bother delivering it, and hiding that fact by sending it back to you stamped ‘address unknown – return to sender’?”

Martin, whom the Wall Street Journal identifies as one of the Bush administration’s more questionable personnel picks, has lately become a bit excitable.

Martin is upset with Comcast because it rejects his hypothesis that allowing consumers to pay only for the cable channels they prefer would reduce cable rates.

Martin sided with the commission’s two Democrats to slam Comcast for managing its broadband network like a traffic cop who works hard to prevent gridlock.

Continue reading →

Stephen Schultze is an up-and-coming technology policy analyst who is a fellow at the Berkman Center for Internet and Society at Harvard University. He is also finishing up his Master of Science in Comparative Media Studies at MIT. He’s been kind enough to stop by here at the TLF on occasion and comment on some of the things we have written — usually to give us grief, but we welcome that too! He’s very sharp and always has something of substance to say, and he says it in a respectful way. So I look forward to many years of intellectual combat with him. (Incidentally, we also share a mutual admiration for the work of Ithiel de Sola Pool, especially his 1983 classic, “Technologies of Freedom: On Free Speech in an Electronic Age,” which I have noted is my favorite tech policy book of all time.)

Anyway, Stephen has just posted his master’s thesis: “The Business of Broadband and the Public Interest: Media Policy for the Network Society.” It’s a noble attempt to defend and extend the “public interest” concept in the Digital Age; Stephen attempts to “identify the several dimensions in which it remains relevant today.” In his thesis, he cites some of my past work, since I have articulated a very different view on the issue. Specifically, he cites a line that I have used in multiple studies and essays:

“The public interest standard is not really a “standard” at all since it has no fixed meaning; the definition of the phrase has shifted with the political winds to suit the whims of those in power at any given time.”

I stand by that quote and down below I have pasted a lengthy passage on the mythology surrounding the public interest standard, which I pulled directly from my old 2005 “Media Myths” book. It explains in more detail why I feel that way.

“Right now is a critical point of media in transition that will affect the shape of the communications ecosystem going forward,” Stephen states in his thesis. I couldn’t agree more, but I completely disagree that this somehow justifies breathing new life into a standard-less standard that invites open-ended, arbitrary governance of the Internet and digital media. Read on to understand why I feel that way…
Continue reading →

Bruce Schneier has a very good op-ed on the Transportation Security Administration’s airport security programs in the Los Angeles Times today. The winning line: “That’s the TSA: Not doing the right things. Not even doing right the things it does.”

In fairness, security is hard. By their nature, federal agencies aren’t smart and nimble. I argued that the TSA should be scrapped in a March 2005 Reason magazine debate.

Along similar lines, I found this amusing: