Broadband & Neutrality Regulation

My colleague Barbara Esbin, a Senior Fellow and Director of the Center for Communications and Competition Policy at The Progress & Freedom Foundation, was asked to pen a short history of the net neutrality wars in the U.S. for a French publication, La Lettre de l’Autorité.  Her essay provides an excellent, concise overview of where we’ve come from and where we might be heading on this front.  I’ve pasted the entire essay down below, or you can download the PDF here.


Net Neutrality Regulation in the United States by Barbara Esbin

PFF Progress Snapshot Release 4.21 October 2008

The United States moved closer to “Net Neutrality” regulation this year when the Federal Communications Commission found that Comcast, a cable broadband Internet service provider, violated a set of Internet policy principles the FCC adopted in 2005 by limiting peer-to-peer (P2P) traffic. The ruling was the culmination of a ten-year effort that began as a call for wholesale “open access” to the cable platform for third-party Internet service providers. Requests for open access first emerged in 1998 when the FCC considered AT&T’s acquisition of cable operator TCI. The FCC rejected open access, but the issue quickly re-emerged in a subsequent proceeding to determine the appropriate regulatory classification of cable Internet service. Depending on how the FCC categorized cable Internet service, it would be subject to telecommunications “common carrier” requirements or “cable service” requirements, or be treated as a then-unregulated “information service.”

In 2002, the FCC classified cable Internet service as an “information service.” This meant that the telecommunications common carrier requirements — that service be provided upon request, without unreasonable discrimination as to rates, terms and conditions of service — would not apply to cable Internet services. The FCC’s decision was upheld by the U.S. Supreme Court in NCTA v. Brand X. Afterwards, advocates of open access re-directed their efforts away from advocating wholesale access for third-party ISPs, and towards rules aimed at consumer rights to a “neutral network” or “net neutrality.”

Continue reading →

Back in the mid- and even late 1990s, I was engaged in a lot of dreadfully boring telecom policy debates in which the proponents of regulation flatly refused to accept the argument that the hegemony of wireline communications systems would ever be seriously challenged by wireless networks. Well, we all know how that story is playing out today. People are increasingly “cutting the cord” and opting to live a wireless-only existence. For example, this recent Nielsen Mobile study on wireless substitution reports that, although only 4.2% of homes were wireless-only at the end of 2003…

At the end of 2007, 16.4 percent of U.S. households had abandoned their landline phone for their wireless phone, but by the end of June 2008, just 6 months later, that number had increased to 17.1 percent. Overall, this percentage has grown by 3-4 percentage points per year, and the trend doesn’t seem to be slowing. In fact, a Q4 2007 study by Nielsen Mobile showed that an additional 5 percent of households indicated that they were “likely” to disconnect their landline service in the next 12 months, potentially increasing the overall percentage of wireless-only households to nearly 1 in 5 by year’s end.

And one wonders how many homes are like mine — we just keep the landline for emergency purposes or to redirect phone spam to that number instead of giving out our mobile numbers.  Beyond that, my wife and I are pretty much wireless-only people, and I’m sure there are a lot of others like us out there.

Anyway, I’ve been having a strange feeling of déjà vu lately as I’ve been engaging in policy debates about the future of the video marketplace.  Just as in those old telecom debates of the last decade, a similar debate — and set of denials — is now playing out in the video arena.  Many lawmakers and regulatory advocates (and even some industry folks) are acting as if the old ways of doing business are the only ways that still count.  In reality, things are changing rapidly as video content continues to migrate online.

I was reminded of that again this weekend when I was reading Nick Wingfield’s brilliant piece in the Wall Street Journal entitled “Turn On, Tune Out, Click Here.”  It is must-reading for anyone following developments in this field.  As Wingfield notes:

Continue reading →

Over at TechDirt, Tom Lee has a sharp critique of Muayyad Al-Chalabi’s much-circulated paper (via GigaOm) opposing bandwidth caps. Make sure to read Tom’s entire essay, but here’s the key take-away:

this whitepaper merely amounts to a complaint that a free lunch is ending. Bandwidth is clearly an increasingly limited resource. And in capitalist societies, money is how we allocate limited resources. The alternate solutions that Al-Chalabi proposes to the carriers on pages 6 and 8 — like P2P mirrors, improved service and “leveraging… existing relationships with content providers” — either assume that network improvements are free, would gut network neutrality, or are simply nonsense.

Indeed. But Tom generally agrees that “Comcast’s bandwidth cap is a drag” and that “Instead of disconnection, there should be reasonable fees imposed for overages. They should come up with a schedule defining how the cap will increase in the future. And the paper’s suggestion of loosened limits during off-peak times is a good one.”

Well, those are three different things, but I generally agree with all of them. Let me just repeat, however, my strong endorsement of the first option — metering at the margin — and again highlight the optimal way to do it from an economic perspective. As I noted in one of my many previous articles about metering for bandwidth hogs:

Continue reading →
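To make the “metering at the margin” idea concrete, here is a minimal sketch of how such an overage schedule might be computed. All the numbers (base price, cap, per-gigabyte rate) are hypothetical and not drawn from any carrier’s actual pricing:

```python
# A toy "metering at the margin" bill calculator: a flat rate covers a
# generous cap, and usage beyond the cap is billed per gigabyte instead
# of triggering disconnection. All prices here are hypothetical.

def monthly_bill(usage_gb: float,
                 flat_rate: float = 40.0,     # hypothetical base price ($)
                 cap_gb: float = 250.0,       # hypothetical monthly cap (GB)
                 overage_per_gb: float = 1.0  # hypothetical marginal rate ($/GB)
                 ) -> float:
    """Return the total monthly bill: flat rate plus metered overage."""
    overage = max(0.0, usage_gb - cap_gb)
    return flat_rate + overage * overage_per_gb

# A light user pays only the flat rate; a heavy user pays at the margin.
print(monthly_bill(80))   # 40.0
print(monthly_bill(400))  # 190.0 (40 + 150 GB over the cap at $1/GB)
```

This is the sense in which marginal pricing allocates scarce capacity: the heaviest users face the marginal cost of their consumption rather than being cut off, or being subsidized by lighter users.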

http://penny-arcade.com/comic/2008/9/26/

Speaking of snakes, I am just returned from a camping trip along the Appalachian Trail in the Michaux Forest, quite out of wireless reception range. Several days’ heavy rain had washed the forest clean, left the moss glowing green and the mushrooms, salamanders, crayfish, and frogs quite content. There one combats the same problems confronted by earlier settlers — mice (and the snakes they attract), staying dry and tolerably warm, the production of decent meals, and keeping small children from wandering off into the woods. Why do some people enjoy briefly returning to this world? Despite being one of those people, I can’t say. Now I am back and my day is easy and comfortable (comparatively), with time to spare contemplating the meta-structures of finance, property, and capital. Let’s all hope these structures are not nearly as fragile as our confidence in them, which, judging from the tone of remarks at last week’s ITIF conference on innovation, has fallen quite low.

Continue reading →

Note: Here’s a post I just put live at DrewClark.com. It refers to an upcoming conference that might be of interest to Tech Liberation readers. Make sure to follow the link to the bottom of the post for registration information for this FREE conference, to be held tomorrow, Friday, October 3, at 8:30 a.m.

If all goes according to plan, on February 17, 2009, television broadcasters will power down their analog transmitters and broadcast their signals only digitally.

After more than 20 years of a long transition to digital television, this might be considered progress. Now, millions of Americans are collecting vouchers from the Commerce Department to subsidize their purchase of converter boxes. These are the electronic devices that take the digital signals — and convert them back to analog — so that viewers with older, analog-only sets can still watch broadcast TV.

What about the bigger questions? Is there any benefit to the public, or to consumers, from the transition to digital television? What about the vaunted visions of hundreds of broadcast channels through multicasting? What would be the new public-interest obligations, if any, of broadcasters? This last question has definitely not been resolved.

Continue reading →

[Not sure if someone else has mentioned this here yet, but…] There’s a terrific piece by Paul Korzeniowski in Forbes this week about the Comcast-BitTorrent debacle called “Feds and Internet Service Providers Don’t Mix.”  It’s well worth reading the whole thing, but I particularly like this passage:

For whatever reason, some believe ISPs should not be able to put any restrictions on the volume of information that any user transmits. That’s absurd. Per-bit and per-byte pricing models have long been used for data transmissions. In trying to build and sustain their businesses, carriers constantly balance their attractiveness and viability versus unlimited usage pricing models. By government decree, they no longer have that option. In effect, the FCC has decided to tell ISPs how to run their networks. A related issue is Comcast’s reluctance to disclose its network management processes. The reason seems obvious. Carriers spend literally billions of dollars installing and fine-tuning their networks each year. If they can move traffic more efficiently from one location to the next than their competitors, it translates to a more profitable bottom line.  But network neutrality advocates maintain that Comcast has an obligation to open its network operation to the world. Why not have Kentucky Fried Chicken publish its original recipe or Coca-Cola tell us how it makes soft drinks?

Exactly. It gets back to a point I stressed in one of our podcasts on this issue: “transparency” regulations are great in theory, but in practice they might have some rather profound implications.  More generally, such rules push the camel’s nose further into the Internet tent by inviting regulators in to meddle more in the name of “transparency.”

As always, Richard Bennett has far more interesting things to say about the issue than me. Check out his essay about this same Forbes piece over at Circle ID.

In my nearly 17 years of public policy work, I have never felt so vindicated about something as I did this weekend when I read Dan P. Lee’s Philadelphia magazine feature on “Whiffing on Wi-Fi.” It is a spectacularly well-written piece about the spectacular failure of Philadelphia’s short-lived experiment with municipally subsidized wi-fi, which was called Wireless Philadelphia.  You see, back in April 2005, I wrote a white paper entitled “Risky Business: Philadelphia’s Plan for Providing Wi-Fi Service,” and it began with the following question: “Should taxpayers finance government entry into an increasingly competitive, but technologically volatile, business market?”  In the report, I highlighted the significant risks involved in light of how rapidly broadband technology and the marketplace were evolving. Moreover, I pointed to the dismal track record of previous municipal experiments in this field, which almost without exception ended in failure. I went on to argue:

Keeping these facts in mind, it hardly makes sense for municipal governments to assume the significant risks involved in becoming a player in the broadband marketplace. Even an investment in wi-fi along the lines of what Philadelphia is proposing is a risky roll of the dice. […] the nagging “problem” of technological change is especially acute for municipal entities operating in a dynamic marketplace like broadband. Their unwillingness or inability to adapt to technological change could leave their communities with rapidly outmoded networks, and leave taxpayers footing the bill.

I got a stunning amount of hate mail and cranky calls from people after I released this paper.  Everyone accused me of being a sock puppet for incumbent broadband providers, or of just not understanding the importance of the endeavor.  But as I told everyone at the time, I wasn’t out to block Philadelphia from conducting this experiment; I just didn’t think it had any chance of being successful.  And, again, I tried to point out what a shame it would be if taxpayers were somehow stuck picking up the tab, or if other providers decided not to invest in the market because they were “crowded out” by government investment in the field.

But even I could never have imagined how quickly the whole house of cards would come crumbling down in Philadelphia.  It really was an astonishing meltdown.  Dan Lee’s article makes that abundantly clear:

Continue reading →

Our conference, “Broadband Census for America,” is fast approaching… The event is tomorrow. If you want to attend, follow the instructions in the press release below:

FOR IMMEDIATE RELEASE

WASHINGTON, September 25, 2008 – California Public Utilities Commissioner Rachelle Chong, a member of the Federal Communications Commission from 1994 to 1997, will kick off the Broadband Census for America Conference with a keynote speech on Friday, September 26, at 8:30 a.m.

Eamonn Confrey, the first secretary for information and communications policy at the Embassy of Ireland, will present the luncheon keynote at noon. Confrey will provide an overview of Ireland’s efforts to collect data on broadband service through a comprehensive web site with availability, pricing, and speed data about carriers.

Following Chong’s keynote address, the Broadband Census for America Conference – the first of its kind to unite academics, state regulators, and entities collecting broadband data – will hear from two distinguished panels.

One panel, “Does America Need a Broadband Census?” will contrast competing approaches to broadband mapping. Art Brodsky, communications director of the advocacy group Public Knowledge, will appear at the first public forum with Mark McElroy, the chief operating officer of Connected Nation, a Bell- and cable-industry-funded organization involved in broadband mapping.

Also participating on the panel will be Drew Clark, executive director of BroadbandCensus.com, a consumer-focused effort at broadband data collection; and Debbie Goldman, the coordinator of Speed Matters, which is run by the Communications Workers of America.

The second panel, “How Should America Conduct a Broadband Census?” will feature state experts, including Jane Smith Patterson, executive director of the e-NC authority; and Jeffrey Campbell, director of technology and communications policy for Cisco Systems. Campbell was actively involved in the California Broadband Task Force.

Others scheduled to speak include Professor Kenneth Flamm of the University of Texas at Austin; Dr. William Lehr of the Massachusetts Institute of Technology; Indiana Utility Regulatory Commissioner Larry Landis; and Jean Plymale of Virginia Tech’s eCorridors Program.

Keynote speaker Rachelle Chong has been engaged in broadband data collection as a federal regulator, as a telecommunications attorney, and since 2006 as a state official.

Chong played an instrumental role in the California Broadband Task Force, which mapped broadband availability in California. She will speak about broadband data collection from the mid-1990s to today.

The event will be held at the American Association for the Advancement of Science’s headquarters at 12th and H Streets NW (near Metro Center) in Washington.

For more information: Drew Bennett, 202-580-8196, Bennett@broadbandcensus.com
Conference web site: http://broadbandcensus.com/conference/
Registration: http://broadbandcensus.eventbrite.com/

“Buzz Out Loud,” one of my favorite podcasts, disappoints me from time to time, specifically when the good folks at CNET decide to bash broadband companies and call them “jerks” and “evil.”

So goes Episode 809 of Buzz Out Loud.  Molly Wood, Jason Howell, and guest host Don Reisinger declare AT&T’s decision to throttle U-Verse (as reported by Ars Technica) to be just another dumb thing that stupid broadband companies do.

One of their reasons for saying so is that AT&T’s U-Verse is fiber, but that’s not true.  U-Verse uses fiber to feed VRADs, or Video Ready Access Devices, which then send the signal out over legacy copper wires, in a sort of DSL-adapted-to-video hybrid.

When you get the facts wrong, your analysis is bound to be bad.

Continue reading →

The introduction below was originally written by Berin Szoka, but now that I (Adam Marcus) am a full-fledged TLF member, I have taken authorship.


Adam Marcus, our exceptionally tech-savvy new research assistant at PFF, has published his first piece at the PFF blog, which I reprint here for your edification.

Today Google’s DC office hosted an interesting panel on cloud computing.  What was missing was a good definition of what “cloud computing” actually is.

While Wikipedia has its own broad definition of cloud computing, many think of cloud computing more narrowly as strictly web-based applications for which clients need nothing but a web browser. But that definition doesn’t cover things like Skype and SETI@home.  And just because PFF has implemented Outlook Web Access so we can access the Exchange server via the Web doesn’t necessarily mean we’ve implemented what most people might think of as “cloud computing.”  Yet these are all variations on a common theme, which leads me to propose my own basic definition: any client/server system that operates over the Internet.

To understand the potential policy and legal issues raised by cloud computing so defined, one must break the discussion down into a four-part grid.  One axis is divided into private data (e.g., email) and public data (e.g., photo sharing).  The other axis is divided into data hosted on a single server or centralized server farm and data hosted on multiple computers in a dynamic peer-to-peer network (e.g., BitTorrent file sharing).

Examples | User Data is Public | User Data is Private
Centralized Server(s) | Blogs, discussion boards, Flickr | Web-based email servers, Windows Terminal Services
Peer-to-Peer | BitTorrent, FreeNet (article) | Skype, Wuala
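To make the grid concrete, here is a minimal sketch in Python of how one might model this taxonomy in code. It is purely illustrative: the service placements follow the table above, and the classification itself is just my proposed definition rather than any established standard.

```python
# A toy model of the four-part cloud computing grid described above.
# The taxonomy and service placements mirror the post's table; they are
# illustrative, not an established industry classification.
from dataclasses import dataclass
from enum import Enum

class Hosting(Enum):
    CENTRALIZED = "centralized server(s)"
    PEER_TO_PEER = "peer-to-peer"

class Visibility(Enum):
    PUBLIC = "user data is public"
    PRIVATE = "user data is private"

@dataclass
class CloudService:
    name: str
    hosting: Hosting
    visibility: Visibility

SERVICES = [
    CloudService("Flickr", Hosting.CENTRALIZED, Visibility.PUBLIC),
    CloudService("Web-based email", Hosting.CENTRALIZED, Visibility.PRIVATE),
    CloudService("BitTorrent", Hosting.PEER_TO_PEER, Visibility.PUBLIC),
    CloudService("Skype", Hosting.PEER_TO_PEER, Visibility.PRIVATE),
]

# Each quadrant of the grid raises different policy questions (privacy,
# copyright, data security), so classifying a service is the first step.
for s in SERVICES:
    print(f"{s.name}: {s.hosting.value} / {s.visibility.value}")
```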

Continue reading →