
After working my way through the Executive Summary of the Federal Communications Commission’s (FCC) National Broadband Plan, there are a number of things I find troubling that I will get to in a subsequent post. But here’s the thing about “The Plan” that I found most surprising — even audacious — in its arrogance: The FCC wants us to believe the whole scheme is costless. The agency bases this astonishing claim on the following assumptions:

Given the plan’s goal of freeing 500 megahertz of spectrum, future wireless auctions mean the overall plan will be revenue neutral, if not revenue positive.  The vast majority of recommendations do not require new government funding; rather, they seek to drive improvements in the government efficiency, streamline processes and encourage private activity to promote consumer welfare and national priorities. The funding requests relate to public safety, deployment to unserved areas and adoption efforts. If the spectrum auction recommendations are implemented, the plan is likely to offset the potential costs.

Let me translate: “Pay no attention to all the bills we are racking up, because spectrum revenues shall set us free!”

Perhaps that logic works in the reality-free zone we call the Beltway, but back in the real world this simply doesn’t add up. Regardless of how well-intentioned any of these goals and proposals may be, it should be equally clear that there is no free lunch, even with spectrum auction proceeds fueling the high-tech gravy train. The proposals and programs the FCC sets forth will impose serious economic costs that shouldn’t be so casually dismissed, especially using the weak reasoning that “improvements in the government efficiency” will magically manifest themselves thanks to massive new government intervention in the field. (If you think you’ve heard this one before, you have. See: The current health care debate.)

Moreover, if everything really does hang on the promise of spectrum auction revenues covering the broadband spending binge, well, bad news: The agency is never going to bring in enough to cover what it has proposed here. The reason is simple: Most of the spectrum it wants to grab is currently occupied by someone else! Continue reading →

The FCC today released an executive summary of its National Broadband Plan, which is supposed to be delivered to Congress tomorrow.  Of course, executive summaries by their nature are brief and usually don’t explain the underlying logic and evidence supporting the conclusions. Here are a few highlights, some possible interpretations, and things to look for when the full plan gets released tomorrow:

Recommendation: “Undertake a comprehensive review of wholesale competition rules to help ensure competition in fixed and mobile broadband.” This could signal that the FCC plans to re-impose “unbundling” or “line sharing” regulations, which would require broadband companies to let competitors use their lines and other facilities at regulated rates. Such initiatives would likely undermine broadband deployment and investment.  Economic research by my GMU colleague Tom Hazlett and others finds that broadband investment, competition, and deployment in the US took off only after the FCC eliminated line-sharing requirements. Christina Forsberg and I summarized a lot of this research here.

Recommendation: “Make 500 MHz of spectrum available for broadband within ten years … Enable incentives and mechanisms to repurpose spectrum.” This is a fantastic recommendation. A Mercatus Center review of the costs of federal telecommunications regulations found that federal spectrum allocation, which prevents spectrum from being reallocated to uses that consumers value highly (like broadband), is by far the costliest federal regulation affecting telecom and the Internet. This recommendation indicates the FCC leadership would like to auction a lot more spectrum and share the proceeds with existing users (like broadcasters) in order to overcome resistance to reallocation. It’s not quite a market in spectrum, but it might be the closest the FCC can come.

Recommendation: “Broaden the USF contribution base to ensure USF remains sustainable over time.” Uh-oh. I’m not sure what this means, but if it means that broadband subscribers will have to start paying into the FCC’s universal service fund (USF), watch out! Most economic studies find that consumer demand for broadband is very price-sensitive. That means if the FCC slaps broadband with universal service fees (which currently exceed 10 percent), we’ll see a big drop in broadband subscribership — maybe by 4-7 million subscribers. This is, of course, precisely the opposite of what the FCC wants to accomplish!
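The back-of-envelope arithmetic behind a 4-7 million figure can be sketched like this. To be clear, the subscriber base and elasticity values below are my illustrative assumptions (roughly 70 million US broadband households circa 2010, and price elasticities of demand in the -0.6 to -1.0 range that many studies report), not numbers from the FCC's plan:

```python
# Back-of-envelope: how a ~10% universal-service fee passed through to
# prices could reduce broadband subscribership. All inputs are
# illustrative assumptions, not figures from the FCC plan.

def subscriber_drop(subscribers, price_increase, elasticity):
    """Estimate lost subscribers from a price increase, using a
    constant point elasticity of demand."""
    return subscribers * price_increase * (-elasticity)

base = 70_000_000   # assumed US broadband subscribers, circa 2010
fee = 0.10          # ~10% USF fee, assumed fully passed through

low = subscriber_drop(base, fee, elasticity=-0.6)   # less sensitive demand
high = subscriber_drop(base, fee, elasticity=-1.0)  # more sensitive demand
print(f"Estimated loss: {low/1e6:.1f}-{high/1e6:.1f} million subscribers")
```

Running this yields a loss of roughly 4.2 to 7.0 million subscribers, which is how an estimate like "4-7 million" can fall out of fairly modest elasticity assumptions.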

Recommendation: “Reform intercarrier compensation, which provides implicit subsidies to telephone companies by eliminating per minute charges over the next ten years…” Another excellent idea.  “Intercarrier compensation” refers to payments phone companies make when they hand traffic off to each other. Small, rural phone companies usually receive the highest per minute payments — as much as 15-30 cents per minute! This is a huge markup on long-distance phone service — another price-sensitive service!

Recommendation: Provide subsidies so that rural areas can have broadband with download speeds of 4 Mbps.  It will be interesting to read in the full plan where this 4 Mbps figure came from. Does it reflect the speed of service that a lot of Americans currently have, so these subsidies are just supposed to help equalize opportunities for rural residents? Or does it reflect some balancing of the costs and benefits of subsidizing broadband in rural areas?  Or is this a magic number experts believe subscribers need, regardless of the choices consumers actually make in the marketplace and regardless of what it costs?

The executive summary also lists a set of goals, such as ensuring that every American has the ability to subscribe to “robust” broadband service, having 100 million households with access to 100 Mbps broadband, and ensuring that the US has the fastest and most extensive wireless networks of any nation.  When the full plan comes out, look carefully at whether or how the FCC plans to measure accomplishment of these goals.  More importantly, look to see whether the FCC explains how it will quantify how much its own policies actually contribute to these goals over time. The FCC is famous for NOT doing these kinds of things, so let’s see if the broadband plan signals a new era in accountability.

Details are starting to trickle out about the Federal Communications Commission’s (FCC) National Broadband Plan, which is due out tomorrow. Someone just posted the Executive Summary here. I haven’t had a chance to go through it all yet, but I’m looking forward to learning more about what the agency’s plans are on this front.

On Friday (again, before seeing any details), I offered some fairly mushy comments about the idea of a national “plan” to the gang over at the excellent new site, FiveQsOnTech.com.  The site has a great format: Five questions on technology and policy asked and answered (usually on tape) by technology policy wonks. I’m honored to be among the first couple of experts featured on the site, along with Markham Erickson of the Open Internet Coalition and Rob Atkinson of ITIF.

In the first 3 minutes of this second of the two videos I appear in, I offered some thoughts about “The Plan”:

http://www.youtube.com/watch?v=sBRL2RfdMk4

I published an opinion piece today for CNET arguing against recent calls to reclassify broadband Internet as a “telecommunications service” under Title II of the Communications Act.

The push to do so comes as supporters of the FCC’s proposed Net Neutrality rules fear that the agency’s authority to adopt them under its so-called “ancillary jurisdiction” won’t fly in the courts.  In January, the U.S. Court of Appeals for the D.C. Circuit heard arguments in Comcast’s appeal of sanctions levied against the cable company for violations of the neutrality principles (not yet adopted under a formal rulemaking).  The three-judge panel expressed considerable doubt about the FCC’s jurisdiction in issuing the sanctions during oral arguments.  Only the published opinion (forthcoming) will matter, of course, but anxiety is growing.

Solving the Net Neutrality jurisdiction problem with a return to Title II regulation is a staggeringly bad idea, and a counter-productive one at that.  My article describes the parallel developments in “telecommunications services” and the largely unregulated “information services” (aka Title I) since the 1996 Communications Act, making the point that life for consumers has been far more exciting, and has generated far more wealth, under the latter than the former.

Under Title I, in short, we’ve had the Internet revolution.  Under Title II, we’ve had the decline and fall of basic wireline phone service, boom and bust in the arbitraging competitive local exchange market, massive fraud in the bloated e-Rate program, and the continued corruption of local licensing authorities holding applications hostage for legal and illegal bribes.

Continue reading →

I was slow to adopt broadband. So maybe it’s also appropriate that I was slow to read John Horrigan’s highly informative survey on broadband adoption released by the Federal Communications Commission on February 23. Or maybe it’s fortuitous, because the delay let me see what messages the news media took away from this survey.

Two clear messages appear in the news coverage.  The first is a variant of the screaming headline the FCC put on its own press release: “93 Million Americans Disconnected from Broadband Opportunities.” You’ll find this as the headline or lead paragraph in coverage by the New York Times and AFP.

The second type of message highlights the main reasons one-third of the population does not subscribe to broadband. “FCC Survey Shows Need to Teach Broadband Basics,” notes the headline on an Associated Press story. According to the survey, the three main obstacles to broadband adoption are cost, lack of digital literacy, and non-adopters’ perception that broadband is not sufficiently relevant to their lives.  (I got a chuckle when I saw that non-adopters said they would be willing to pay $25, on average, for broadband; that’s the magic price that finally induced me to give in and sign up!)

But whoa, what’s missing here?  Our old friend Availability. Broadband was supposed to be some kind of nouveau public works project that would take hundreds of billions of dollars to bring to fruition, because many Americans lack access to broadband. “Build it and they will come!” “Pour that concrete information superhighway!” “Stimulate the economy!”

The FCC survey tells an interesting story about availability:

Of the … non-adopters, 12 percent say they cannot get broadband where they live. This translates into a 4 percent share of Americans—on the basis of their reports on infrastructure availability in their neighborhood—who say they are unable to obtain broadband because it is not available. This means that 31 percent of all Americans can get service but do not. (p. 5)

The survey also notes that 10 percent of rural respondents say broadband is not available where they live.  I don’t mean to sound insensitive, but that’s all?  Heck, I’d have guessed a higher percentage than that.   

To put the numbers in perspective: 4 percent of Americans say they don’t have broadband because it isn’t available, while almost three times as many — 10 percent — lack broadband because they think the Internet is irrelevant to their lives.
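The survey's arithmetic is easy to verify. Using the roughly 35 percent non-adopter share implied by the 31 percent and 4 percent figures above (L49's "one-third" is the rounder version of the same number):

```python
# Checking the FCC survey's availability arithmetic. The 35% non-adopter
# share is implied by the survey's own figures: 31% "can get service but
# don't" plus 4% "can't get it."

non_adopter_share = 0.35   # share of Americans without broadband
unavailable_among = 0.12   # share of non-adopters reporting no service available

cannot_get = non_adopter_share * unavailable_among  # ~0.042 -> "4 percent"
wont_get = non_adopter_share - cannot_get           # ~0.308 -> "31 percent"

print(f"Unavailable: {cannot_get:.0%}; available but unsubscribed: {wont_get:.0%}")
```

So of the one-third of Americans without broadband, availability explains only a small slice; the rest is cost, digital literacy, and perceived relevance.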

Is availability a problem in some places?  Sure. But the FCC survey shows it isn’t nearly as big a problem as we’d been led to believe. So let’s hope the National Broadband Plan’s discussion of availability is similarly circumscribed and appropriately targeted.

Today I am testifying at an FCC hearing on “Serving the Public Interest in the Digital Era.” [Speaker lineup here.] The purpose of the workshop is to explore:

  • A brief history and overview of policies involving “public interest” requirements for commercial media and telecommunications companies;
  • The state of local commercial broadcast TV and radio news and information; and
  • The impact of media convergence and the emergence of the Internet, mobile technologies, and digital media on FCC media policy.

In my remarks, I focused on “Why Expansion of the FCC’s Public Interest Regulatory Regime is Unwise, Unneeded, Unconstitutional, and Unenforceable.” Down below I have attached my written remarks. Continue reading →

Very cool little video here by Jess3 documenting Internet growth and activity. Ironically, Berin sent it to me as Adam Marcus and I were updating the lengthy list of Net & online media stats you’ll find down below. Many of the stats we were compiling are shown in the video. Enjoy!

http://vimeo.com/9641036
  • 1.73 billion Internet users worldwide as of Sept 2009; an 18% increase from the previous year. [1]
  • 81.8 million .COM domain names at the end of 2009; 12.3 million .NET names & 7.8 million .ORG names. [2]
  • 234 million websites as of Dec 2009; 47 million were added in 2009. [3] In 2006, Internet users in the United States viewed an average of 120.5 Web pages each day. [4]
  • There are roughly 26 million blogs on the Internet [5] and even back in 2007, there were over 1.5 million new blog posts every day (17 posts per second). [6] Continue reading →

Former national intelligence director Mike McConnell had a 1,400-word piece in the Washington Post on Sunday in which he stressed a public-private partnership as the key to a robust cyber-defense. One paragraph caught my attention, though:

We need to develop an early-warning system to monitor cyberspace, identify intrusions and locate the source of attacks with a trail of evidence that can support diplomatic, military and legal options — and we must be able to do this in milliseconds. More specifically, we need to reengineer the Internet to make attribution, geolocation, intelligence analysis and impact assessment — who did it, from where, why and what was the result — more manageable. The technologies are already available from public and private sources and can be further developed if we have the will to build them into our systems and to work with our allies and trading partners so they will do the same.

I’m not sure what he’s talking about, and I’d love it if a knowledgeable reader would chime in. I don’t see how such a spoof-proof geolocation system would work without a complete overhaul of how the Internet works.

Every Tuesday, Washington, DC’s local NPR station (88.5 WAMU) carries a “Tech Tuesdays” program as a regular part of The Kojo Nnamdi Show.  This week’s show, which was guest hosted by Marc Fisher of the Washington Post, was on “Regulating the World Wide Web: A View from Abroad.” It was a wide-ranging and very interesting discussion about the future of Internet governance and regulation, featuring:

  • Evgeny Morozov: Yahoo! Fellow at the Institute for the Study of Diplomacy at Georgetown University; Fellow, Open Society Institute; and author “Net Effect” blog on ForeignPolicy.com
  • John Morris: General Counsel, Director of the Internet Standards, Technology and Policy Project, Center for Democracy and Technology
  • Olivier Tesquet: Reporter, Slate.fr (France)

Listen here. It’s worth your time.

Debate over the regulatory status of broadband heated up this week as trade associations and major broadband companies sent a letter to the Federal Communications Commission arguing strenuously against reclassification of broadband as “telecommunications service” subject to regulation under Title II of the Communications Act. One implication of Title II regulation is that broadband could be regulated like a public utility. Comparisons of broadband to services like electricity or railroads, which I discussed last week, also raise the prospect of public utility regulation. 

Classic public utility regulation restricts entry and regulates prices to prevent firms from charging excessive prices.  It’s typically used in situations where competition is believed to be impossible (or, where pre-existing policy decisions have created monopolies that aren’t going to go away very soon).

Broadband is not a monopoly; it is an oligopoly. Contrary to popular perception, that is not synonymous with “evil.” Although both monopoly and oligopoly end in “-opoly,” that doesn’t mean broadband competitors will charge monopoly prices, or even somewhat excessive prices.  The only firm conclusion that emerges from economic literature on oligopoly is, “anything’s possible, depending on the specific facts and circumstances.”

But there are also firm conclusions that emerge from economic literature on public utility regulation.  Just about every time the federal government has tried to impose public utility regulation on an oligopoly, it has ended up enforcing a cartel.  This is what happened in the past with railroads, trucking, airlines, and brokerage firms. There are a few times federal price regulation did not enforce cartels in oligopolistic or competitive industries. In those cases, it usually created shortages  — most notably gasoline and natural gas in the 1970s.

Title II regulation is not necessarily synonymous with public utility regulation. Title II could be used to impose some “nondiscrimination” requirements, without necessarily directly regulating broadband providers’ prices or profits.

But anyone who actually wants the FCC to regulate broadband providers’ prices and profits needs to read the peer-reviewed economics literature on the actual effects of public utility regulation in practice on the federal level. (More literature is cited here.) Then they need to explain why the results in broadband would be different.  And the explanation needs to be better than “We know better now, we’re smart, and we promise.”