Telecom & Cable Regulation

Major speed enhancements are rumored to be coming soon from Comcast, which has been spending serious cash to upgrade its network to the DOCSIS 3.0 standard. Customers in many markets who now pay $42.95 a month for 6 Mbps/1 Mbps service will be upgraded to 12/2 — a doubling of both downstream and upstream speeds — with no corresponding price increase. This follows Comcast’s pattern of enhancing speeds without hiking prices. And the price point of the standard tier has remained unchanged in nominal terms for several years, so when you factor in inflation, it’s fair to say Comcast has actually been dropping prices.

It’s amazing to consider how broadband speeds have evolved in a relatively short period of time. Comcast’s highest tier was a mere 4 Mbps/384 kbps just four years ago, when DSL speeds typically topped out at 3/768. For consumers who live in a competitive ISP market, DSL now offers 20/1, fiber offers 30/5, and cable will soon offer 22/5. All of these tiers are priced under $100 per month.

Though we may not be amidst a “price war” among ISPs per se, as Mike Masnick recently argued, there is simply no denying that price per megabit is declining rapidly. This is all thanks to competition, of course, which has pushed providers to invest in newer technologies that allow for faster broadband connectivity.
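To make the price-per-megabit point concrete, here is a quick back-of-the-envelope sketch (purely illustrative) using the Comcast tier mentioned above: the same $42.95 monthly price divided by the old 6 Mbps and the new 12 Mbps downstream speeds.

```python
# Illustrative sketch: price per megabit before and after Comcast's
# rumored DOCSIS 3.0 upgrade. The price and speeds are the figures
# cited in the post; the rest is simple arithmetic.

monthly_price = 42.95  # unchanged across the upgrade

old_downstream_mbps = 6
new_downstream_mbps = 12

old_price_per_mbit = monthly_price / old_downstream_mbps
new_price_per_mbit = monthly_price / new_downstream_mbps

print(f"Before: ${old_price_per_mbit:.2f} per Mbps per month")
print(f"After:  ${new_price_per_mbit:.2f} per Mbps per month")
```

That works out to about $7.16 per megabit per month at the old tier versus roughly $3.58 after the upgrade — the same sticker price buying twice the capacity.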

Market skeptics will assuredly respond to my optimism by pointing out that so long as Comcast sticks with its 250GB monthly usage cap, consumers are really just getting the same service with shinier packaging. Yet that fact hardly means we should scoff at Comcast’s new performance tiers.

As I’ve discussed on several occasions, I churn through a lot of file transfers each month, so I’m all for Comcast raising its cap (or, alternatively, implementing reasonable overage fees). But even with Comcast’s fairly generous limits, who isn’t ecstatic about being able to download any file in half as much time as before? Caps will surely evolve over time as demand for 1080p content delivered over the Internet grows, but for now, speed is a bigger concern than usage for most consumers.
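The download-time claim is easy to sanity-check: transfer time scales inversely with line speed, so doubling the downstream rate halves it. A small sketch, with an arbitrary example file size (the 4.7 GB figure is mine, not from the post):

```python
# Rough transfer-time estimate: doubling line speed halves download time.
# Uses decimal units (1 GB = 8,000 megabits) and ignores protocol overhead.

def download_minutes(size_gb: float, speed_mbps: float) -> float:
    """Minutes to transfer size_gb gigabytes at speed_mbps megabits/second."""
    size_megabits = size_gb * 8_000
    return size_megabits / speed_mbps / 60

# A hypothetical 4.7 GB file (roughly a full DVD image):
print(f"At 6 Mbps:  {download_minutes(4.7, 6):.0f} minutes")
print(f"At 12 Mbps: {download_minutes(4.7, 12):.0f} minutes")
```

Under those assumptions the file takes about 104 minutes at the old tier versus 52 at the new one — exactly half.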

Back in the mid- and even late 1990s, I was engaged in a lot of dreadfully boring telecom policy debates in which the proponents of regulation flatly refused to accept the argument that the hegemony of wireline communications systems would ever be seriously challenged by wireless networks. Well, we all know how that story is playing out today. People are increasingly “cutting the cord” and opting to live a wireless-only existence. For example, this recent Nielsen Mobile study on wireless substitution reports that, although only 4.2% of homes were wireless-only at the end of 2003…

At the end of 2007, 16.4 percent of U.S. households had abandoned their landline phone for their wireless phone, but by the end of June 2008, just 6 months later, that number had increased to 17.1 percent. Overall, this percentage has grown by 3-4 percentage points per year, and the trend doesn’t seem to be slowing. In fact, a Q4 2007 study by Nielsen Mobile showed that an additional 5 percent of households indicated that they were “likely” to disconnect their landline service in the next 12 months, potentially increasing the overall percentage of wireless-only households to nearly 1 in 5 by year’s end.

And one wonders how many homes are like mine — we keep the landline only for emergency purposes, or to redirect phone spam to that number instead of giving out our mobile numbers. Beyond that, my wife and I are pretty much wireless-only people, and I’m sure there are a lot of others like us out there.

Anyway, I’ve been having a strange feeling of déjà vu lately as I’ve been engaging in policy debates about the future of the video marketplace. Like those old telecom debates of the last decade, we are now witnessing a similar debate — and a similar set of denials — playing out in the video arena. Many lawmakers and regulatory advocates (and even some industry folks) are acting as if the old ways of doing business are the only ways that still count. In reality, things are changing rapidly as video content continues to migrate online.

I was reminded of that again this weekend when I was reading Nick Wingfield’s brilliant piece in the Wall Street Journal entitled “Turn On, Tune Out, Click Here.” It is must-reading for anyone following developments in this field. As Wingfield notes:

Continue reading →

Over at TechDirt, Tom Lee has a sharp critique of Muayyad Al-Chalabi’s much-circulated paper (via GigaOm) opposing bandwidth caps. Make sure to read Tom’s entire essay, but here’s the key take-away:

this whitepaper merely amounts to a complaint that a free lunch is ending. Bandwidth is clearly an increasingly limited resource. And in capitalist societies, money is how we allocate limited resources. The alternate solutions that Al-Chalabi proposes to the carriers on pages 6 and 8 — like P2P mirrors, improved service and “leveraging… existing relationships with content providers” — either assume that network improvements are free, would gut network neutrality, or are simply nonsense.

Indeed. But Tom generally agrees that “Comcast’s bandwidth cap is a drag” and that “Instead of disconnection, there should be reasonable fees imposed for overages. They should come up with a schedule defining how the cap will increase in the future. And the paper’s suggestion of loosened limits during off-peak times is a good one.”

Well, those are three different things but I generally agree with all of them. Let me just repeat, however, my strong endorsement of the first option — metering at the margin — and again highlight the optimal way to do it from an economic perspective. As I noted in one of my many previous articles about metering for bandwidth hogs:

Continue reading →

Note: Here’s a second post I just put live at DrewClark.com. It refers to an upcoming conference, on Friday, October 3, sponsored by the Information Economy Project at George Mason University School of Law. It will be held at 8:30 a.m. at the National Press Club. Registration details are below.

In the United States, the regulation of broadcast radio and television has always been done under a different standard than the regulation of the print medium.

As Secretary of Commerce in the administration of President Calvin Coolidge, Herbert Hoover declared at the Fourth National Radio Conference, in 1925: “The ether is a public medium, and its use must be for a public benefit. The dominant element for consideration in the radio field is, and always will be, the great body of the listening public, millions in number, country-wide in distribution.”

When Congress created the Federal Radio Commission in 1927, it decreed that broadcasting was to serve the “public interest, convenience and necessity,” and this standard was re-affirmed in the Communications Act of 1934. Several Supreme Court decisions — albeit decisions that have been much criticized — affirmed that broadcasting could and should be treated differently than the traditional “press.”

This differential treatment for broadcasting — versus the print medium, and also cable television — was underscored by the decisions in Red Lion Broadcasting Co. v. FCC (1969), which upheld the “Fairness Doctrine,” and FCC v. Pacifica Foundation (1978), which upheld indecency rules for over-the-air broadcast television. The Fairness Doctrine required broadcasters to grant reply time to those whose views had been criticized on the air.

Continue reading →

Our conference, “Broadband Census for America,” is fast approaching…. The event is tomorrow. If you want to attend, follow the instructions in the press release below:

FOR IMMEDIATE RELEASE

WASHINGTON, September 25, 2008 – California Public Utilities Commissioner Rachelle Chong, a member of the Federal Communications Commission from 1994 to 1997, will kick off the Broadband Census for America Conference with a keynote speech on Friday, September 26, at 8:30 a.m.

Eamonn Confrey, the first secretary for information and communications policy at the Embassy of Ireland, will present the luncheon keynote at noon. Confrey will provide an overview of Ireland’s efforts to collect data on broadband service through a comprehensive web site with availability, pricing and speed data about carriers.

Following Chong’s keynote address, the Broadband Census for America Conference – the first of its kind to unite academics, state regulators, and entities collecting broadband data – will hear from two distinguished panels.

One panel, “Does America Need a Broadband Census?” will contrast competing approaches to broadband mapping. Art Brodsky, communication director of the advocacy group Public Knowledge, will appear at the first public forum with Mark McElroy, the chief operating officer of Connected Nation, a Bell- and cable-industry funded organization involved in broadband mapping.

Also participating on the panel will be Drew Clark, executive director of BroadbandCensus.com, a consumer-focused effort at broadband data collection; and Debbie Goldman, the coordinator of Speed Matters, which is run by the Communications Workers of America.

The second panel, “How Should America Conduct a Broadband Census?” will feature state experts, including Jane Smith Patterson, executive director of the e-NC authority; and Jeffrey Campbell, director of technology and communications policy for Cisco Systems. Campbell was actively involved in the California Broadband Task Force.

Others scheduled to speak include Professor Kenneth Flamm of the University of Texas at Austin; Dr. William Lehr of the Massachusetts Institute of Technology; Indiana Utility Regulatory Commissioner Larry Landis; and Jean Plymale of Virginia Tech’s eCorridors Program.

Keynote speaker Rachelle Chong has been engaged in broadband data collection as a federal regulator, as a telecommunications attorney, and since 2006 as a state official.

Chong was instrumental to the California Broadband Task Force, which mapped broadband availability in California. She will speak about broadband data collection from the mid-1990s to today.

The event will be held at the American Association for the Advancement of Sciences’ headquarters at 12th and H Streets NW (near Metro Center) in Washington.

For more information:
Drew Bennett, 202-580-8196
Bennett@broadbandcensus.com
Conference web site: http://broadbandcensus.com/conference/
Registration: http://broadbandcensus.eventbrite.com/


Scott Cleland has an unusually even-keeled post today (Where are the bullets and bolding, Scott?!) about how Google undermines its own policy arguments on net neutrality regulation by promoting more sources of broadband – in this case, satellite.

What has always mattered, of course, is getting more broadband platforms up and running. The debate over net neutrality regulation is a sideshow, and probably a detriment to communications progress, as it casts a cloud of regulatory uncertainty over the industry. The higher costs, slower rollouts, and lower profits that flow from uncertain regulations probably chill investment in any potential new broadband platform.

But I’m here to tell you, Scott, that even if Google helps put a couple more broadband platforms in place, the goalposts will move.

Today, I came across a letter sent by Senate Antitrust Subcommittee Chairman Herb Kohl (D-WI) asking the four major wireless providers why the price of text messaging has gone up. He says that the price has gone from 10 cents per message sent or received in 2005 to 20 cents on all four carriers.
Continue reading →

Writing at Slate, Tim Wu tries to make Obama out to be the real Big Government candidate on media policy, who will deliver “if not a chicken in every pot, a fiber-optic cable in every home.” By contrast, Wu implies that McCain is just another pro-big business lackey who doesn’t understand “that the media and information industries are special—that like the transportation, energy, or financial industries, they are deeply entwined with the public interest.” Wu goes on to say:

Ultimately, most of the difference in Obama’s and McCain’s media policies boils down to questions about whether the media is special and a dispute over how much to trust the private sector. Camp McCain would tend to leave the private sector alone, with faith that it will deliver to most Americans what they want and deserve. The Obama camp would probably administer a more frequent kick in the pants, in the belief that good behavior just isn’t always natural.

First, as a factual matter, Wu is just wrong about McCain being some sort of radical hands-off, pro-market liberalizer on media policy issues. Oh, if only that were true! But for those of us who have been in DC covering telecom and media policy for many years, it is widely understood that there is no nailing down John McCain on any tech, telecom or media policy issue. He’s been all over the board. While he has sponsored or supported some deregulatory initiatives on the telecom front in the past, he’s also been a supporter of other regulatory causes. His battles with broadcasters and cable, for example, are well-known. Most recently, McCain has been leading the effort to impose a la carte mandates on cable and satellite operators.
Continue reading →

I’ve had the opportunity to be involved in the planning and organization of several conferences this fall, including one exciting event, entitled “Consensus FCC Reforms and the Communications Agenda,” which I have organized in my capacity as Assistant Director of the Information Economy Project at George Mason University.

You can read more details about the event at the Information Economy Project web site, but the basic gist is that, in spite of controversies swirling over issues such as network neutrality, media ownership and universal service, some policy observers believe that a range of reforms may attract bipartisan consensus. These opportunities may be more likely to be realized if identified prior to the November 2008 election.

We’ve been fortunate enough to have a stellar cast of participants, including two former chairmen of the Federal Communications Commission – William Kennard, who served under President Clinton, and Michael Powell, who served under President George W. Bush. They’ll be speaking about substantive issues for consensus, and their discussion will be moderated by Amy Schatz, a reporter for The Wall Street Journal.

But we’ll also be talking about procedural issues — questions of agency structure, rules, and the day-to-day practices and operations that do much to shape telecom policy. That panel, which features chief staffers for almost all of the recent FCC chairmen, will be moderated by me.

Here’s the full program:

8:30 a.m.         Welcome by Thomas W. Hazlett, Professor of Law and Economics, GMU

Panel I:           Improving Procedures at the Federal Communications Commission
8:40 a.m.
Peter Pitsch, chief of staff to Dennis Patrick, FCC Chairman, 1987-1989
Robert Pepper*, former chief, Office of Plans and Policy, FCC, 1989-2005
Ken Robinson, senior legal advisor to Al Sikes, FCC Chairman, 1989-1993
Blair Levin, chief of staff to Reed Hundt, FCC Chairman, 1993-1997
Kathy Brown, chief of staff to William Kennard, FCC Chairman, 1998-2001

Moderator: Drew Clark, Assistant Director, Information Economy Project

Panel II:          A Cross-Partisan Agenda for Telecommunications Policy Reforms
9:45 a.m.
William Kennard, Chairman, FCC, 1997-2001
Michael Powell, Chairman, FCC, 2001-2005

Moderator: Amy Schatz, Reporter, The Wall Street Journal

When: Tuesday, September 16, 2008, 8:30 a.m. – 11 a.m.
Where: National Press Club, 529 14th St. NW, 13th Floor, Washington, DC

Admission is free, but seating is limited. See IEP Web page: http://iep.gmu.edu.
To reserve your spot, please email Drew Clark: iep.gmu@gmail.com.

About the Information Economy Project:
The Information Economy Project at George Mason University sits at the intersection of academic research and public policy, producing peer-reviewed scholarly research, as well as hosting conferences and lectures with prominent thinkers in the Information Economy. The project brings the discipline of law and economics to telecommunications policy. More information about the project is available at http://iep.gmu.edu.

After a little bit of suspense, Comcast today filed suit in federal court challenging the FCC’s authority to sanction it for “unreasonable network practices.” I say suspense because there was speculation that Comcast might have decided to look the other way and live with a decision that didn’t really force it to do much the market hadn’t already made it do. I’m happy to see that they’re not standing for Kevin Martin’s blatant overreach. As I’ve said many times before, the FCC has no authority to punish a company for behaving “unreasonably” when it has never established criteria for what is reasonable.

I don’t know which statement, specifically, Saul Hansell is referring to, but in his New York Times post breaking the news, he wrote:

Kevin Martin, the commission’s chairman, has argued that making rules in advance is not a good method to regulate fast-moving markets like Internet service. Under his stewardship, the commission has published broad principles and has taken action only when it found that objectionable practices have occurred.

I love that. Making laws before we apply them isn’t really efficient.

If you want every gory detail about why the FCC’s order should fall, I heartily recommend to you Barbara Esbin’s recent paper [PDF] on the matter. Esbin is a fourteen-year veteran of the FCC and, among other things, in her paper she explodes an argument that I’ve been hearing lately, namely that the FCC has “ancillary jurisdiction” to regulate broadband network management practices. She writes:

As Commissioner Adelstein stated: “[T]he Order sets out the Commission’s legal authority under Title I of the Act, explaining that preventing unreasonable network discrimination directly furthers the goal of making broadband Internet access both “rapid” and “efficient.” This appears to be a paraphrase of Section 1 of the Act, which recites the Act’s purposes and the reason for creation of the FCC, including “regulating interstate and foreign commerce in communication by wire and radio so as to make available . . . a rapid, efficient, Nation-wide and world-wide wire and radio communication service with adequate facilities at reasonable charges…” But because Title I is also considered the source of “ancillary jurisdiction,” that is akin to saying that the FCC can regulate if its actions are ancillary to its ancillary jurisdiction, and that is one ancillary too many.

Amen.

Well, another four months have passed since I last asked this question, but let me pose it again: Where exactly is the FCC’s Video Competition Report, and why is it taking so long to get it out the door? It wouldn’t have anything to do with a certain Chairman Ahab still trying to get his cable whale, would it? No, of course not. I’m sure there’s a perfectly rational reason that this 13th Annual report is now something like 18 months past due. Right.

And keep in mind that the data in the 13th report covers a period ending on June 30, 2006, so whenever the report finally comes out, the data in it will be well over two years old! That won’t exactly reflect the true state of the video programming market, considering the significant changes we have seen since that time, especially the continued explosive growth of online video, VOD, and DVRs.

The reason I have been making a big deal out of this issue is that it gets to the question of just how “scientific” and “independent” an agency the FCC really is. We are talking about facts here. Basic data. This is stuff the FCC should be routinely collecting and reporting on a timely basis — indeed, that is what Congress requires the agency to do in this specific case. And yet the agency can’t do it because its Chairman is on this Moby Dick-like crusade against the cable industry. By the time this 13th annual report finally sees the light of day, the 15th annual report might be due! Outrageous. (And you wonder why many of us here are so skeptical about empowering the FCC to regulate the Internet via Net neutrality mandates! If an over-zealous Chairman can politicize this issue, just think what might happen once we give the agency the authority to regulate the Net.)

Anyway, down below you will find the paper that Barbara Esbin and I wrote about the issue four months ago. Perhaps we should place a little ticker somewhere here on the site that counts each day that passes as we wait for the Commission to produce this report. We can take bets on when the agency’s data holdout will end.
Continue reading →