Broadband & Neutrality Regulation

Many in the press (NYT, AP) are commenting this morning on how Google on Monday encouraged the Federal Communications Commission to design its forthcoming auction of radio frequencies to take advantage of real-time airwaves auctions. It’s one more bit of news emerging from the 700 Megahertz (MHz) auction, which the FCC must begin before January 2008. In the words of telecom analyst Blair Levin, of Stifel Nicolaus, it is shaping up to be “a pivotal auction” that could provide “new blood for broadband… or [a] telco/cable sweep.”

But there was another noteworthy filing at the FCC on Monday. The White Spaces Coalition — whose members include Dell, EarthLink, Google, Hewlett-Packard, Intel, Microsoft and Philips Electronics — met with commission officials and provided them with a prototype device for operating in vacant television broadcast channels. Philips’ device joins one previously submitted by Microsoft. (Look at page 3 for a picture of the “Microsoft TV White Spaces Development Platform.”)

Just as the 700 MHz band offers new hope for telecom and video competition, many technology companies are looking to the vacant TV bands. The reason is simple: television channels are scattered across the band, principally because they were designed around the 1940s-era NTSC standard, named after the National Television System Committee. As a look at the broadcast band for the ZIP code 20006 demonstrates, using FCC metrics, only four of the 21 channels between 30 and 50 are occupied: 32, 45, 47 and 50. That leaves 17 available within the “white spaces” between the frequencies where those stations broadcast. The occupied channel numbers vary from city to city, which is why advanced sensing capabilities are needed to even begin utilizing the spectrum in the television band for something other than broadcasting.
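The arithmetic behind those numbers is straightforward and can be sketched in a few lines. The occupied channels below are the ones cited above for ZIP code 20006; any other market would have a different set, which is exactly why a white-space device needs to sense its surroundings rather than rely on a fixed channel map.

```python
def vacant_channels(occupied, low=30, high=50):
    """Return the TV channels in [low, high] not claimed by a local broadcaster."""
    return sorted(set(range(low, high + 1)) - set(occupied))

# Channels occupied in ZIP code 20006, per the FCC metrics cited above.
available = vacant_channels({32, 45, 47, 50})
print(len(available))  # 17 channels left over as "white spaces"
```

A device in another city would feed a different `occupied` set into the same calculation, which is the whole sensing problem in miniature.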


Over at Cato@Liberty, I’ve got a post making the slightly obvious point that Digg is a microcosm of the Internet as a whole. Digg, like the Internet as a whole, is an automated and decentralized information-processing system. And just as Digg ultimately faced a choice between allowing the AACS key to be on the site or shutting the site down, we face the same basic choice as a society: unless we want to shut down the Internet (or radically redesign it, which could amount to the same thing) we’ve got little choice but to allow some level of illicit content to be traded.

This seems to me to be a nice illustration of a point that I’ve often tried to make about the network neutrality debate, because it seems to me that the telcos face a similar challenge with regard to the management of their networks. Many of the horror stories pro-regulatory types tell about a post-neutrality future assume that the telcos have fine-grained control over what kind of content flows over their networks. They’ll censor liberal blogs, or shut down particular categories of innovative new applications, or sign exclusive deals where (say) one sports website is the official sports website and all the others are blocked or degraded.

But an ISP attempting to implement such a fine-grained, coercive strategy on a user base numbering in the millions is likely to find its users reacting in creative ways that confound the scheme. Tech-savvy users will immediately start running services on non-standard ports or tunneling their connections over encrypted links. They’ll find ways to camouflage one category of traffic as another, such as making a VoIP session look like a World of Warcraft game. Soon you’d see user-friendly applications available for download that let moderately tech-savvy users pull off the same tricks. And application developers will start integrating these tricks into their software, so that an application automatically detects whose network it’s on and uses the appropriate countermeasure.

(Geeky aside: it’s possible to imagine open source networking libraries that do this automatically and transparently, presenting an API that allows the application developer to pretend he’s on a normal, open network. Indeed, I bet you’d end up with a situation similar to the one we saw with open source instant messaging libraries a couple of years ago: the telco would introduce new routing policies in an effort to break unauthorized applications. The creators of the circumvention libraries would find a new work-around, publish it, and all the application developers would have to do is download the new library and recompile.)
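The shape of such a library is easy to sketch. This is a toy illustration of the idea, not any real project: every class and function name here is hypothetical. The point is that the countermeasure lives behind the API, so the application never has to know which trick is currently in use.

```python
class OpenTransport:
    """Plain connection on the standard port, used when nothing is blocked."""
    def describe(self):
        return "direct connection on the standard port"

class TunneledTransport:
    """Traffic wrapped in an encrypted tunnel on a non-standard port."""
    def describe(self):
        return "encrypted tunnel on a non-standard port"

def pick_transport(isp_blocks_traffic: bool):
    """The library, not the application, chooses the countermeasure.

    A real library would probe the network; here we just take a flag.
    """
    return TunneledTransport() if isp_blocks_traffic else OpenTransport()

# Application code is identical either way; shipping a new library
# version is all it takes to deploy a new work-around.
conn = pick_transport(isp_blocks_traffic=True)
print(conn.describe())
```

That last property is what makes the arms race so lopsided: the telco has to re-engineer its filters, while the applications just recompile.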

Of course, the telcos could always go for the nuclear option and block all traffic they can’t validate as “approved,” effectively converting the open network into a closed one. But that would come at a very high price, because there’s a long tail of content and a long tail of applications. An Internet that only does the things on your ISP’s approved list is dramatically less useful than an open Internet, just as Digg would be a dramatically less successful site if it only featured stories that had been pre-vetted by the site’s employees.

So while telcos may have formal control over their pipes, they probably have less practical control over Internet content than is generally assumed. An open network is much more useful to users (and will therefore generate more revenue) than a closed one, but once you have an open network it’s very hard to limit how it’s used.

Lots going on this week on the wireless Net neutrality front. You will recall that a couple of weeks ago several of us here were blasting the new paper by Tim Wu and the petition by Skype asking the FCC to impose Carterfone-like regulatory mandates on the wireless industry. This battle is now simply known as “the wireless Net neutrality fight” here in Washington. And this week several important studies opposing it have been released by CTIA, the wireless industry’s trade association, and by economists from the American Enterprise Institute, the Brookings Institution, and the Phoenix Center. I don’t have time to summarize them, but here are the links to each major report if you are interested:

(1) Filing of CTIA – The Wireless Association In the Matter of Skype Communications Petition to Confirm A Consumer’s Right to Use Internet Communications Software and Attach Devices to Wireless Networks (April 30, 2007).

(2) Robert W. Hahn, Robert E. Litan, and Hal J. Singer, “The Economics of ‘Wireless Net Neutrality,’” AEI-Brookings Joint Center for Regulatory Studies, Working Paper No. RP07-10 (April 2007).

(3) George S. Ford, Thomas M. Koutsky and Lawrence J. Spiwak, “Wireless Net Neutrality: From Carterfone to Cable Boxes,” PHOENIX CENTER POLICY BULLETIN No. 17 (April 2007).


If you look back at all the writing we have done here on Net Neutrality (NN), it seems to me that the common theme of our collective opposition to regulation is that we just don’t know what we’re getting ourselves into. No doubt, we’re skeptics about most regulatory proposals, but with good reason. Our government does not have a very good track record when it comes to regulating communications or high-technology markets for the purposes of improving consumer welfare. In fact, just the opposite is usually the result. Consumers typically are on the losing end of grandiose regulatory schemes that are supposed to serve “the public interest.” As a century’s worth of communications industry regulation proved, regulation typically results in stagnant markets, lackluster innovation and limited consumer choice.

That’s why yesterday’s new Notice of Inquiry about Net neutrality from the Federal Communications Commission (FCC) has me so worried. It tees up all the questions that we’ve been asking here for the past few years. The difference is, of course, that now the whole world is going to flood the agency with answers, and many of those answers will call for regulatory action.

Just the way the FCC frames some of the questions in this Notice concerns me, especially in terms of the breadth of what the agency is investigating. Consider how the discussion kicks off:

Continue reading →

Here’s a column I wrote recently on the connection between the two.


WASHINGTON, April 11, 2007 – The wireless industry association CTIA has retained an economic consulting firm run by the former boss of FCC Chairman Kevin Martin to poke holes in proposals to modify a forthcoming auction of radio frequencies.

Continue reading →

Japan has 7.2 million all-fiber broadband subscribers who pay $34 per month, and incumbent providers NTT East and NTT West have only a 66% market share. According to Takashi Ebihara, a Senior Director in the Corporate Strategy Department at Japan’s NTT East Corp. and currently a Visiting Fellow at the Center for Strategic and International Studies here in Washington, Japan has the “fastest and least expensive” broadband in the world, and non-incumbent CLECs have a “reasonable” market share. Ebihara was speaking at the Information Technology and Innovation Foundation, and his presentation can be found here. Ebihara said government strategy played a significant role. Local loop unbundling and line sharing led to fierce competition in DSL, which forced the incumbents to move to fiber-to-the-premises.

Continue reading →

Julian has some more smart things to say about network neutrality and the Pizza Hut analogy:

Suppose an ISP wants to build out infrastructure to support 100 Mbps service in an area that doesn’t currently have it. Problem: There are customers who might like that higher speed access for a few purposes, like streaming movies, but not enough who are prepared to pay the premium to upgrade their whole connection, so it’s not cost effective for the ISP to make the investment. One solution might be metering: you let customers pay for the bandwidth they use, paying a bit more for bursts of higher speed needed to access specific sites, with a lower flat rate for the majority of the time, when they’re just reading news and checking email. The problem is that consumers seem to have largely rejected metering: people want to pay one rate for their access, and not have to think about their usage level on a day-to-day basis.
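The consumer-side objection is easy to see with a back-of-the-envelope sketch. All of the prices and usage figures below are hypothetical, invented purely to illustrate why metered bills feel unpredictable next to a flat rate:

```python
BASE_GB_PRICE = 0.50   # hypothetical $/GB at ordinary speed
BURST_GB_PRICE = 2.00  # hypothetical $/GB at the premium 100 Mbps tier

def metered_bill(base_gb, burst_gb):
    """Monthly bill under usage-based pricing with a premium-speed surcharge."""
    return base_gb * BASE_GB_PRICE + burst_gb * BURST_GB_PRICE

light_month = metered_bill(base_gb=40, burst_gb=5)    # mostly email and news
heavy_month = metered_bill(base_gb=120, burst_gb=40)  # lots of movie streaming
print(light_month, heavy_month)  # 30.0 vs. 140.0
```

Under these made-up numbers the same household's bill can swing more than fourfold from month to month, which is precisely the day-to-day accounting that flat-rate customers are paying to avoid.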

Continue reading →