Antitrust & Competition Policy

Barbara Esbin and I have just released a short PFF essay asking: “Where is the FCC’s Annual Video Competition Report?” The FCC is required to produce this report annually, yet the last one is well over a year past due, and the data it contains will be over two years old by the time it comes out. I’ve embedded our paper about this below.


Well, I didn’t exactly get a chance to say enough for this to qualify as much of a “debate,” but I was brought in roughly a half hour into this WBUR (Boston NPR affiliate) radio show featuring Jonathan Zittrain, author of the recently released The Future of the Internet–And How to Stop It. Jonathan was kind enough to suggest to the producers that I might make a good respondent to push back a bit against the thesis set forth in his new book.

Jonathan starts about 6 minutes into the show and they bring me in around 29 minutes in. Although I only got about 10 minutes to push back, I thought the show’s host Tom Ashbrook did an excellent job raising many of the same questions I do in my 3-part review (Part 1, 2, 3) of Jonathan’s provocative book.

In the show, I stress the same basic points I made in those reviews: (1) he seems to be overstating things quite a bit in saying that the old “generative” Internet is “dying”; and in doing so, (2) he creates a false choice of possible futures from which we must choose. What I mean by false choice is that Jonathan doesn’t seem to believe a hybrid future is possible or desirable. I see no reason why we can’t have the best of both worlds: a world full of plenty of tethered appliances, but also plenty of generativity and openness.

If you’re interested, listen in.

In case you haven’t read about it in a newspaper yet, The Heritage Foundation this week released a new paper of mine on the FCC’s new newspaper cross-ownership rule and congressional efforts to “disapprove” the changes. I argue that the 21st century hasn’t been kind to the newspaper. As I’ve pointed out before (here and here), newspapers just aren’t the powerhouse they once were: few citizens today get their first or last news of the day from a bundle of paper tossed in the azaleas by a teenager on a bicycle.

Bottom line: not only are the FCC’s changes justified, but the agency didn’t go nearly far enough.

Here’s the full piece.

As Hance discussed last Thursday, the FCC will soon rule on AT&T’s petition for regulatory forbearance. Over at Openmarket.org I blog about why the FCC should grant phone companies relief from costly reporting requirements:

America’s two largest phone companies, AT&T and Verizon, recently filed forbearance petitions asking the FCC for relief from various regulations. Verizon is asking for the freedom to set prices on wholesale connections to competitive local carriers, and AT&T has requested exemption from certain FCC audit requirements and service quality reporting mandates.  

The real question is, why should Verizon have to ask permission from bureaucrats to decide how much to charge for its products? And why must AT&T spend millions of dollars to fill out intricate paperwork just to prove to the FCC its product is good enough for customers? 

Interventionists say this is because phone companies won’t ensure service quality unless they are subject to government oversight. But this claim ignores market conditions. With competition among phone providers intensifying and new wireless networks on the verge of completion, the market will discipline any communications company that skimps on service or price. Sprint and Comcast have learned this lesson the hard way.


There’s been quite a bit of discussion on this forum recently about whether vacant television channels — also called “white spaces” — should be licensed or unlicensed. Currently, of course, most of us experience unlicensed wireless in the form of Wi-Fi, as in the 2.4 GHz band.

Last year, I wrote about the issue of white spaces, mainly in the context of the National Association of Broadcasters:

…[B]roadcasters lost the spectrum wars – or at least the first spectrum war of the 21st Century. In early 2006, Congress said enough: broadcasters weren’t effectively using channels 52 to 69, and certainly wouldn’t need them after the transition to digital television (DTV) was completed. Television stations will be forced off those channels, corresponding to 698-806 MHz, on February 17, 2009.

That’s 700 MHz. But what about 500 MHz and 600 MHz? All told, there are 294 MHz of frequencies that broadcasters will continue to occupy even after the DTV switchover. If more than 85 percent of Americans receive television from cable or satellite, as they do, what sense does it make to reserve these choice frequencies for broadcasters’ exclusive use?

Not very much. [more…]
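For anyone checking the arithmetic behind the quoted figures: U.S. television channels are 6 MHz wide, and channel 37 is reserved for radio astronomy, so both numbers work out.

```latex
% Channels 52--69, the band being cleared: 18 channels of 6 MHz each
(69 - 52 + 1) \times 6~\text{MHz} = 108~\text{MHz} = 806~\text{MHz} - 698~\text{MHz}
% Channels 2--51 minus reserved channel 37: 49 channels stay with broadcasters
(51 - 2 + 1 - 1) \times 6~\text{MHz} = 294~\text{MHz}
```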

Now, the broadcasters are basically out of the picture, and the battle is shaping up more pointedly between the wireless carriers in CTIA (the wireless association formerly known as the Cellular Telecommunications and Internet Association) and high-tech titans like Dell, Google, Microsoft, and Philips.

Let’s take a step back from the current debate, though.

All of the unlicensed wireless devices in common use today were largely illegal until significant changes were enacted by the Federal Communications Commission in the mid-1980s.

While the policy measures that unleashed unlicensed wireless have remained largely in the shadows, they’ll be the subject of a half-day conference at the Information Economy Project, at George Mason University School of Law, on Friday, April 4. More information is available at http://iep.gmu.edu.


Interesting piece today by equity analyst Scott Berry on a site called seekingalpha.com. Addressing the ongoing Comcast-BitTorrent imbroglio, he argues that ISPs have promised an all-you-can-eat “broadband salad bar” at a fixed price. But that model, he points out, is breaking down under increased demand. As he puts it:

…the growth of video is stealing the condiments, and file sharers are sneezing in the salad.

ISPs have a choice, he says: “Limit how many trips each patron can make for salad. Or charge them for each trip. Comcast has tried the former. My bet is on the latter.”

Not very appetizing, but well put. Once again, it turns out there is no free lunch. Or at least not one that hasn’t been sneezed into.

Is increasing use of video likely to cause Internet delays? The New York Times today floats the theory that it might.

But at least the article generously quotes a leading skeptic: Andrew Odlyzko, professor of mathematics and director of the Digital Technology Center and the Minnesota Supercomputing Institute at the University of Minnesota.

[Odlyzko] estimates that digital traffic on the global network is growing about 50 percent a year, in line with a recent analysis by Cisco Systems, the big network equipment maker.

That sounds like a daunting rate of growth. Yet the technology for handling Internet traffic is advancing at an impressive pace as well. The router computers for relaying data get faster, fiber optic transmission gets better and software for juggling data packets gets smarter.

“The 50 percent growth is high. It’s huge, but it basically corresponds to the improvements that technology is giving us,” said Professor Odlyzko, a former AT&T Labs researcher. Demand is not likely to overwhelm the Internet, he said.
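To put the 50 percent figure in perspective, here is a quick bit of arithmetic (mine, not Odlyzko’s or the Times’s): at that rate, traffic doubles roughly every 21 months, so as long as routers and fiber improve at a comparable pace, supply keeps up with demand.

```latex
t_{\text{double}} = \frac{\ln 2}{\ln 1.5} \approx 1.71~\text{years} \approx 21~\text{months}
```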

Odlyzko will be in Arlington, Va., next Tuesday, March 18, giving a “Big Ideas About Information” lecture at the Information Economy Project at the George Mason University School of Law.

Back in 1999, when everyone was saying that the Internet was doubling every three months, or 1500 percent annual growth, Odlyzko was the voice of reason: the Internet was only growing at 100 percent per year.
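For anyone verifying those two figures: doubling every three months compounds four times a year, while 100 percent annual growth is a single doubling per year.

```latex
% four doublings per year
2^{4} = 16 \qquad (16 - 1) \times 100\% = 1500\%~\text{annual growth}
```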

In his “Big Ideas About Information” lecture next Tuesday, Professor Odlyzko will compare the Internet bubble of the turn of the century with the British Railway Mania of the 1840s, the greatest technology mania in history until the dot-com bubble. In both cases, he argues, clear evidence available at the time indicated that the financial instruments involved were headed for a crash.

The event, at 4 p.m., is the latest in a series sponsored by IEP, which is directed by Professor Tom Hazlett. (I serve as Assistant Director of the project.) Can’t make it to Arlington, Va., for the “Big Ideas” lecture? Join us remotely, by Webcast, or over the phone, at TalkShoe.

Well, I think many of us here can appreciate Lawrence Lessig’s call to “blow up the FCC,” as he suggested in an interview with National Review this week. But I wonder: who, then, would be left to enforce his beloved net neutrality mandates and the media ownership rules he favors? He’s advocated regulation on both fronts, but it ain’t gonna happen without some bureaucrats around to fill in the details and enforce all the red tape.

Regardless, I wholeheartedly endorse his call for sweeping change. Here’s what he told National Review:

One of the biggest targets of reform that we should be thinking about is how to blow up the FCC. The FCC was set up to protect business and to protect the dominant industries of communication at the time, and its history has been a history of protectionism — protecting the dominant industry against new forms of competition—and it continues to have that effect today. It becomes a sort of short circuit for lobbyists; you only have to convince a small number of commissioners, as opposed to convincing all of Congress. So I think there are a lot of places we have to think about radically changing the scope and footprint of government.

Amen, brother. If he’s serious about this call, then I encourage Prof. Lessig to check out the “Digital Age Communications Act” project, which over 50 respected, bipartisan economists and legal scholars drafted together to start moving us down this path.

The rural broadband debate has been in the news a lot lately. Yesterday, DSL Reports ran a story sharply criticizing a report released by the US Internet Industry Association (an ISP lobbying group). But as Ars pointed out, the report actually offers some facts revealing that broadband availability in the U.S. isn’t nearly as bad as some have suggested.

79 percent of homes with a phone line can now get DSL, and 96 percent of homes with cable can get broadband. Considering that just about every home has a phone line, and most have cable, these numbers suggest the main reason for the scarcity of rural broadband users isn’t a lack of availability but a lack of adoption. Of course, rural areas have slower speeds and higher prices than urban areas. That makes sense: building out a network in a low-density area costs more per subscriber than in an urban area, where a single apartment complex can house hundreds of users.

Still, some groups argue that massive government subsidies are needed to promote broadband deployment in rural areas. ConnectedNation (a Washington-based non-profit) released a report a couple of weeks ago, “The Economic Impact of Stimulating Broadband Nationally,” which concluded that accelerating broadband could pump $134 billion into the U.S. economy.


I really enjoyed attending the Collective Intelligence FOO Camp, sponsored by Google and O’Reilly Media, last weekend. I’d been expecting a sort of geek slumber party, and had looked forward to rolling out my awesome Darth Vader impersonation. I was all set to cut loose with a growling, “I’m your father, Luke.” It didn’t quite come to that, but I still had a blast, meeting lots of smart, informed, articulate, creative, and successful people. Friendly people, too.

I described how to establish the legality of real-money, open-access prediction markets under U.S. law. I called my presentation Getting from Collective Intelligence to Collective Action [PPT file]. In brief, I proposed this algorithm (a toy code sketch of the click-wrap step follows the list):

  1. Set up an enterprise prediction market, make playing it a condition of continued employment, and offer valuable prizes to the best predictors.
  2. Set up a limited-access prediction market, hire a number of independent contractors as researchers to play it, pay them a relatively low salary for doing so, and offer valuable prizes to the best predictors.
  3. Set up an open-access prediction market but require anyone playing it to go through a click-wrap license that creates the sort of independent researcher relationship described at step 2, above.
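To make step 3 concrete, here is a minimal, purely illustrative Python sketch of the click-wrap gate. The class, names, and agreement text are my own assumptions, not anything from the actual presentation; a real system would need lawyer-approved license language and real market mechanics.

```python
# Illustrative sketch of step 3: an open-access prediction market that
# refuses entry until the player accepts a click-wrap agreement casting
# them as an independent researcher. All names here are hypothetical.

RESEARCHER_AGREEMENT = (
    "By clicking ACCEPT you agree to participate as an independent "
    "researcher, reporting predictions in exchange for prize eligibility."
)

class OpenAccessMarket:
    def __init__(self) -> None:
        self.researchers: set[str] = set()
        self.predictions: dict[str, list[tuple[str, float]]] = {}

    def join(self, user_id: str, clicked_accept: bool) -> None:
        """Admit a player only after the click-wrap is accepted."""
        if not clicked_accept:
            raise PermissionError("You must accept the researcher agreement.")
        self.researchers.add(user_id)

    def predict(self, user_id: str, question: str, probability: float) -> None:
        """Record a prediction from an admitted researcher."""
        if user_id not in self.researchers:
            raise PermissionError("Join the market (and accept) first.")
        if not 0.0 <= probability <= 1.0:
            raise ValueError("Probability must be between 0 and 1.")
        self.predictions.setdefault(user_id, []).append((question, probability))

market = OpenAccessMarket()
market.join("alice", clicked_accept=True)
market.predict("alice", "Will the FCC act on white spaces this year?", 0.6)
```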
