Bill and Keep and the Free Market

July 31, 2008

My last post sparked some interesting discussion about the economics of the Internet. With all due respect to my co-blogger Hance, though, this is precisely the sort of thing I was talking about:

[Tim’s post] unfortunately overlooks the essence of what NN regulation is really about as far as commercial entities are concerned, i.e., profitable online properties don’t want to be asked or obliged to negotiate service agreements with network providers in which they agree to share some of their profits with network providers for the upkeep of the Internet and for the improvement of the overall online experience — just like retailers in a shopping mall share a small percentage of their profits with the landlord.

Bret likewise says that “NN advocates have for several years now wanted to force service providers into one business plan where the end-user pays ALL the costs of the network.” It will surely be news to Eric Schmidt, Steve Ballmer, and Jerry Yang that they aren’t “obliged to negotiate service agreements with network providers.” In point of fact, Google, Microsoft, Yahoo! and other big online companies pay millions of dollars to their ISPs to help finance the “upkeep of the Internet.” The prices they pay are negotiated in a fully competitive market.

Here’s a thumbnail sketch of how the Internet is structured: it’s made up of thousands of networks of various sizes, with lots and lots of interconnections between them. When two networks decide to interconnect, they typically evaluate their relative sizes. If one network is larger or better-connected than the other, the smaller network will typically pay the larger network for connectivity. If the networks are roughly the same size, they will typically swap traffic on a settlement-free basis.
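That decision rule can be sketched as a toy model. The 2x size-ratio threshold below is purely illustrative, not an industry standard; real peering policies also weigh traffic ratios, geographic reach, and other factors.

```python
# Toy model of the interconnection decision described above.
# The ratio_threshold value is an illustrative assumption.

def interconnection_terms(size_a: float, size_b: float,
                          ratio_threshold: float = 2.0) -> str:
    """Return the likely commercial arrangement between two networks,
    given rough measures of their size (e.g., traffic volume)."""
    larger, smaller = max(size_a, size_b), min(size_a, size_b)
    if larger / smaller <= ratio_threshold:
        # Roughly comparable networks swap traffic with no money changing hands.
        return "settlement-free peering"
    # Otherwise the smaller network buys connectivity ("transit") from the larger.
    return "smaller network pays for transit"

print(interconnection_terms(100, 90))    # comparable networks
print(interconnection_terms(1000, 10))   # lopsided pairing
```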

The result is that a typical packet on the Internet will generally go “upstream” to progressively larger networks, it may then cross a peering point between two equally-sized networks, and then it will go back “downstream” to its destination. Payments along any route flow “upstream” from each end to the peering point in the middle. The net result is that each side of any given connection pays roughly the cost of reaching its own side of the Internet’s “backbone.” Google pays the cost of getting its traffic to the backbone, and then consumers pay the cost of getting the traffic from the backbone to their homes.

I take Hance’s complaint to be that Google should be paying for more than just its “half” of the connection: that Google should help defray the costs of getting packets from the backbone to individual consumers. To take Hance’s shopping mall example, this would be akin to Macy’s being required to contribute to the upkeep not only of the streets around its store, but also to the residential streets and driveways of every one of its customers, no matter how far away those customers live. That’s not how we do street finance, and it’s not how Internet pricing works either.

He doesn’t really elaborate on why this model would be better, but the primary reason not to adopt it is a matter of simple arithmetic. Under the present Internet architecture, any given node on the network only has to negotiate contractual relationships with the nodes immediately adjacent to it. For most users, that’s a single payment to the people “upstream.” For ISPs, that may mean paying a handful of different companies for “upstream” connectivity and charging many “downstream” customers for access. The result is that the number of contractual relationships is of the same order of magnitude as the number of nodes on the network.

In contrast, if every pair of nodes on the network had to negotiate a contractual relationship, the number of contracts could grow as the square of the number of nodes. There are about a billion nodes on the Internet, so there could theoretically be on the order of 10^18 contracts signed. Even if we assume contracts only between websites and residential ISPs, millions of websites times thousands of ISPs still yields billions of contracts. Abandoning the current peering model for some kind of cost-recovery model would mean an enormous increase in logistical overhead.
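The back-of-the-envelope arithmetic can be checked directly. The node, website, and ISP counts below are the rough round figures from the text, not measurements.

```python
# Order-of-magnitude comparison of the two contracting models.
nodes = 1_000_000_000   # ~a billion Internet nodes (rough figure)
websites = 1_000_000    # "millions of websites" (illustrative)
isps = 10_000           # "thousands of ISPs" (illustrative)

# Today: each node contracts only with its immediate neighbors,
# so the contract count is on the same order as the node count.
bill_and_keep_contracts = nodes

# Hypothetical: every pair of nodes negotiates directly -- the count
# scales as the square of the number of nodes.
all_pairs_contracts = nodes ** 2

# Even restricted to website-to-ISP deals, the count explodes.
website_isp_contracts = websites * isps

print(f"{bill_and_keep_contracts:.0e}")  # 1e+09
print(f"{all_pairs_contracts:.0e}")      # 1e+18
print(f"{website_isp_contracts:.0e}")    # 1e+10
```

Ten billion website-to-ISP contracts is the *conservative* scenario; that is roughly a ten-fold increase in contracting overhead even before counting anyone else on the network.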

Bret says that “we should let the market find the way in this dynamic arena,” and I agree. The thing is, the market has been finding the way for the last decade; that’s how we got the current “bill and keep” pricing structure. Which is precisely why I think it’s silly for free-market social scientists to be criticizing it. It’s one thing to say that the FCC shouldn’t rule out alternative pricing mechanisms—I quite agree—but it’s quite another to suggest that the status quo is defective and needs to be changed posthaste. I think both the evidence and libertarian theory suggest otherwise.
