“Revered” Engineering Principles and Other Unpersuasive Arguments

November 13, 2006

For the past few days, TLF’s Tim Lee and Brooke Oberwetter of the Competitive Enterprise Institute have been engaging in a, well, spirited discussion over net neutrality. The whole thing seems to have started when Oberwetter linked approvingly to one of Tim’s TLF posts. Proving that no good deed goes unpunished, Tim responded with a detailed criticism of Oberwetter. Such is the blogosphere.

You can see the whole gory mess here, here, here, and here. Oberwetter argues that tiered pricing for content delivery could potentially benefit consumers, by opening up another dimension of competition. Tim comes down hard on this idea–arguing that such a fee system is unlikely to develop, and in any case would be a bad thing.

At the risk of shattering the image of universal pan-TLF consensus, I have some problems with Tim’s easy dismissal of this potential market development.

My first nit to pick is Tim’s argument that tiered pricing won’t work. He says that content owners wouldn’t be interested in paying more for faster download speeds, since the potential changes wouldn’t provide them any competitive advantage. But competition takes place at the margin–even small differences can have significant effects. (And for some content–say movie downloads–the differences aren’t that marginal).

He also argues that fees are impractical–writing that it would “be a massive administrative task.” Perhaps, but then I’m puzzled as to why network owners seem to think it’s doable. Whether you agree it’s a good thing or not, it’s hard to see why they would fight for the right to do something so clearly impossible. And similar complexity has been overcome elsewhere–Tim himself cites the backbone market, where similar problems have been addressed. This just doesn’t seem to be an insuperable barrier.

None of this means tiered pricing will, or even probably will, develop. Markets–especially dynamic, innovative ones such as this–are unpredictable. The one thing we know is that things will happen that we don’t expect. For that reason, I’m not willing to make firm predictions about the future, and I’m skeptical of anyone who is.

More troubling, Tim goes beyond factual prediction and makes a value judgment that tiered pricing would be a bad thing. He makes clear that he opposes regulation. But he also maintains that such a fee system would violate the “revered” end-to-end principle, and it would “suck” if that happened.

Strangely, in one post, Tim seems to argue that tiered pricing would not violate the end-to-end principle, writing that “adding QoS guarantees to the TCP/IP protocol stack would be a very modest adjustment to the end-to-end principle. Packets might come stamped with a priority level, and route higher-priority packets before lower-priority ones.” Perhaps I have misunderstood his point, but it seems that this is what Brooke was originally suggesting–there’s no need to look further into what content is in each packet.
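The mechanism Tim describes–packets stamped with a priority level, with higher-priority packets dispatched first–can be sketched in a few lines. This is a hypothetical illustration of strict-priority scheduling, not any actual router implementation; the class and packet labels are invented for the example:

```python
import heapq
import itertools

class PriorityScheduler:
    """Strict-priority queue: a lower number means higher priority.
    A running counter breaks ties so equal-priority packets stay FIFO."""
    def __init__(self):
        self._heap = []
        self._seq = itertools.count()

    def enqueue(self, packet, priority):
        # Store (priority, arrival order, packet) so the heap sorts
        # first by priority, then by arrival.
        heapq.heappush(self._heap, (priority, next(self._seq), packet))

    def dequeue(self):
        _, _, packet = heapq.heappop(self._heap)
        return packet

sched = PriorityScheduler()
sched.enqueue("bulk email", priority=3)
sched.enqueue("video frame", priority=1)
sched.enqueue("web page", priority=2)

order = [sched.dequeue() for _ in range(3)]
# The video frame goes out first, the bulk email last.
```

Note that nothing here inspects the contents of any packet–the scheduler looks only at the priority stamp, which is why this kind of tiering need not involve peering into what each packet carries.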

But would it be a bad thing if tiered pricing did erode the end-to-end principle? Tim argues it would be. In debating this with Oberwetter, he makes an argument from authority that “the vast majority of computer scientists agree with me,” citing Tim Berners-Lee and Google’s Vint Cerf.

I don’t claim to be an expert in computer science, but there are some who do not agree–notably Internet pioneer David Farber. Christopher Yoo of Vanderbilt (albeit a mere economist) makes the case that, even at the beginning, the end-to-end principle was not meant to be an inviolable rule. He writes:

Although the end-to-end argument does support a presumption against introducing higher-level functions into the network’s core, it does not justify elevating this presumption into an inviolable precept. Conceding that it is “too simplistic to conclude that the lower levels should play no part in obtaining reliability,” Saltzer, Reed, and Clark’s original article articulating the end-to-end argument squarely concludes that “the end-to-end argument is not an absolute rule, but rather a guideline that helps in application and protocol design analysis.”

Tim also makes an economic argument for the end-to-end principle, arguing that “[c]ompanies like Verizon and Comcast know a lot about how to get packets from point A to point B. They’re generally not so good at designing great computer games, web sites, or video applications.”

But this argument gets us nowhere. I’ve no quarrel with the division of labor; it’s a good thing, no doubt. But so is integration, and synergy, and all sorts of other things. And sometimes those trump division of labor. Otherwise, I’d be buying the beef for my Big Mac separately from the buns. Division of labor can be good, but sometimes other factors are more important.

The hard part is determining which it is in a particular situation, and what provides the most benefits to consumers. And the marketplace is far better at doing this than any number of revered engineering principles, computer scientists or (god forbid) policy analysts. To make such calls without reference to markets is–in Hayek’s phrase–a fatal conceit.
