Wireless & Spectrum Policy

Some impressive numbers here from the CTIA’s Semi-Annual Wireless Industry Survey. There are now more than 276 million wireless users in the U.S., which is almost 14 million more subscribers than there were at this point last year. (Seriously, is there anyone in America who doesn’t have their own phone in their pocket or purse these days?) More amazing is the seemingly never-ending explosive growth of text messaging. The CTIA reports that:

text messaging continues to be enormously popular, with more than 740 billion text messages carried on carriers’ networks during the first half of 2009—breaking down to 4.1 billion messages per day. That’s nearly double the number from last year, when only 385 billion text messages were reported for the first half of 2008. Wireless subscribers are also sending more pictures and other multi-media messages with their mobile devices—more than 10.3 billion MMS messages were reported for the first half of 2009, up from 4.7 billion in mid-year 2008.

Most of us probably hadn’t even sent one text message ten years ago. And now there are 4.1 billion of the suckers flying off our phones every day. That is astonishing.

And we’re still gabbing plenty, too. “[W]ireless customers have already used more than 1.1 trillion minutes in the first half of 2009—breaking down to 6.4 billion minutes-of-use per day.” As the Grim Reaper said in Monty Python’s “The Meaning of Life”: “You always talk, you Americans. You talk and you talk.”
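Those per-day figures are just the half-year totals divided out. A quick back-of-the-envelope check (the 181-day half year is my assumption, not the survey’s):

```python
# Back-of-the-envelope check of the CTIA survey's per-day figure.
# Assumes the first half of 2009 spans roughly 181 days (Jan 1 - Jun 30).
HALF_YEAR_DAYS = 181
TOTAL_TEXTS = 740e9  # text messages carried in the first half of 2009

texts_per_day = TOTAL_TEXTS / HALF_YEAR_DAYS
print(f"{texts_per_day / 1e9:.1f} billion texts per day")  # matches the quoted 4.1 billion
```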

Interesting piece here from Slate’s Farhad Manjoo on why AT&T should dump unlimited data plans and end what he calls the “iPhone all-you-can-eat buffet.”  He notes that: “The typical smartphone customer consumes about 40 to 80 megabytes of wireless capacity a month. The typical iPhone customer uses 400 MB a month. AT&T’s network is getting crushed by that demand.” Because “some iPhone owners are hogging the network” and causing “a slowed-down wireless network,” Manjoo recommends a congestion pricing model as a method of balancing supply and demand:

How would my plan work? I propose charging $10 a month for each 100 MB you upload or download on your phone, with a maximum of $40 per month. In other words, people who use 400 MB or more per month will pay $40 for their plan, or $10 more than they pay now. Everybody else will pay their current rate—or less, as little as $10 a month. To summarize: If you don’t use your iPhone very much, your current monthly rates will go down; if you use it a lot, your rates will increase. (Of course, only your usage of AT&T’s cellular network would count toward your plan; what you do on Wi-Fi wouldn’t matter.)

To understand the advantages of tiered pricing, let’s look at AT&T’s current strategy of spending billions to build more network space. Why won’t this work? For the same reason building more roads doesn’t reduce traffic—more capacity increases the attractiveness of driving, which brings a lot more cars to the road, which leads to more gridlock.
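Manjoo’s tiered proposal is simple enough to express as a billing function. Here is a minimal sketch; billing each started 100 MB block in full, with a $10 monthly minimum, are my assumptions, since the column doesn’t spell out those details:

```python
import math

def monthly_data_charge(megabytes_used: float) -> int:
    """Sketch of Manjoo's congestion-pricing tiers: $10 per 100 MB
    block of cellular data (Wi-Fi excluded), capped at $40/month.
    Rounding each started block up to a full block, and the $10
    minimum, are assumptions for illustration."""
    blocks = max(math.ceil(megabytes_used / 100), 1)
    return min(10 * blocks, 40)

# A light user pays less than today's flat rate; a heavy user pays $40.
print(monthly_data_charge(80))    # 10
print(monthly_data_charge(400))   # 40
```

Under this scheme the typical smartphone user (40–80 MB/month) pays $10, while the typical iPhone user (400 MB/month) hits the $40 cap, which is exactly the incentive structure Manjoo describes.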

Congestion pricing and metering is something I’ve written quite a bit about here in the context of wireline broadband (1, 2, 3), but Manjoo is equally correct that it could be applied to wireless data plans. It has the added value of taking pressure off lawmakers to impose Net neutrality regulation, since pricing the pipe becomes an effective substitute for most other forms of network management. In other words: price bandwidth-hogging customers and applications, don’t block them. The problem, Manjoo explains: Continue reading →

Tomorrow, Friday, Oct. 2, the Information Economy Project at the George Mason University School of Law will hold a conference on Michael Heller’s new book The Gridlock Economy. Surprisingly Free will be streaming live video of the conference kick-off debate between Heller and Richard Epstein at 8:30 a.m. (It will also be available for download later for folks allergic to early mornings.)

Called “Tragedies of the Gridlock Economy: How Mis-Configuring Property Rights Stymies Social Efficiency,” the conference will

explore a paradox that broadly affects the Information Economy. Property rights are essential to avoid a tragedy of the commons; defined properly, such institutions yield productive incentives for creation, conservation, discovery and cooperation. Applied improperly, however, such rights can produce confusion, wasteful rent-seeking, and a tragedy of the anti-commons.

This conference, building on Columbia University law professor Michael Heller’s book, The Gridlock Economy, tackles these themes through the lens of three distinct subjects: “patent thickets,” reallocation of the TV band, and the Google Books copyright litigation.

In the meantime, check out this video of Michael Heller at Google giving his elevator pitch.

Last Wednesday, Holman Jenkins penned a column in The Wall Street Journal about net neutrality (Adam discussed it here). In response, I have a letter to the editor in today’s Wall Street Journal:

To the Editor:

Mr. Jenkins suggests that Google would likely “shriek” if a startup were to mount its servers inside the network of a telecom provider. Google already does just that. It is called “edge caching,” and it is employed by many content companies to keep costs down.

It is puzzling, then, why Google continues to support net neutrality. As long as Google produces content that consumers value, they will demand an unfettered Internet pipe. Political battles aside, content and infrastructure companies have an inherently symbiotic relationship.

Fears that Internet providers will, absent new rules, stifle user access to content are overblown. If a provider were to, say, block or degrade YouTube videos, its customers would likely revolt and go elsewhere. Or they would adopt encrypted network tunnels, which route around Internet roadblocks.

Not every market dispute warrants a government response. Battling giants like Google and AT&T can resolve network tensions by themselves.

Ryan Radia

Competitive Enterprise Institute

Washington

To be sure, the market for residential Internet service is not all that competitive in some parts of the country — Rochester, New York, for instance — so a provider might in some cases be able to get away with unsavory practices for a sustained period without suffering the consequences. Yet ISP competition is on the rise, and a growing number of Americans have access to three or more providers. This is especially true in big cities like Chicago, Baltimore, and Washington D.C.

Instead of trying to put a band-aid on problems that stem from insufficient ISP competition, the FCC should focus on reforming obsolete government rules that prevent ISP competition from emerging. Massive swaths of valuable spectrum remain unavailable to would-be ISP entrants, and municipal franchising rules make it incredibly difficult to lay new wire in public rights-of-way for the purpose of delivering bundled data and video services.

One of the projects I run is OpenRegs.com, an alternative interface to the federal government’s official Regulations.gov site. With the help of Peter Snyder, we recently developed an iPhone app that would put the Federal Register in your pocket. We duly submitted it to Apple over a week ago, and just received a message letting us know that the app has been rejected.

The reason? Our app “uses a standard Action button for an action which is not its intended purpose.” The action button looks like the icon to the right.

According to Apple’s Human Interface Guidelines, its purpose is to “open an action sheet that allows users to take an application-specific action.” We used it to bring up a view from which a user could email a particular federal regulation. Instead, we should have used an envelope icon or something similar. Sounds like an incredibly fastidious reason to reject an application, right? It is, and I’m glad they can do so.
Continue reading →

In a week in which neutrality regulation is making a lot of news, I hope that Robert Hahn and Hal Singer’s terrific new study, “Why the iPhone Won’t Last Forever and What the Government Should Do to Promote its Successor” gets some attention. It provides a wonderful overview of how dynamically competitive the mobile marketplace has been over the past two decades and why critics are wrong to get worked up about the short-term “dominance” of Apple’s iPhone. Here’s the abstract of their paper:

Because of the overwhelming, positive response to the iPhone as compared to other smart phones, exclusive agreements between handset makers and wireless carriers have come under increasing scrutiny by regulators and lawmakers. In this paper, we document the myriad revolutions that have occurred in the mobile handset market over the past twenty years. Although casual observers have often claimed that a particular innovation was here to stay, they commonly are proven wrong by unforeseen developments in this fast-changing marketplace. We argue that exclusive agreements can play an important role in helping to ensure that another must-have device will soon come along that will supplant the iPhone, and generate large benefits for consumers. These agreements, which encourage risk taking, increase choice, and frequently lower prices, should be applauded by the government. In contrast, government regulation that would require forced sharing of a successful break-through technology is likely to stifle innovation and hurt consumer welfare.

“New technologies often seemingly emerge from nowhere, but also frequently lose their luster quickly,” Hahn and Singer go on to argue. As evidence they cite the recent examples of Second Life and MySpace, which were hyped as potentially becoming dominant providers in their respective areas just a few years ago but are now subjected to intense competition. “[T]he mobile handset market is subject to these same disruptive forces,” they argue: Continue reading →

FOXNews.com has just published an editorial that I penned about Monday’s net neutrality announcement from the FCC.

Does Obama Want to Control the Internet?

by Ryan Radia

The federal government may gain broad new powers to regulate Internet providers next month if Federal Communications Commission Chairman Julius Genachowski gets his way. In a milestone speech on Monday, Genachowski proposed sweeping new regulations that would give the FCC the formal authority to dictate application and network management practices to companies that offer Internet access, including wireless carriers like AT&T and Verizon Wireless.

Genachowski’s proposed rules would make good on a pledge President Obama made during his campaign to enshrine net neutrality as law. The announcement was met with cheers by a small but vocal crowd of activists and academics who have been pushing hard for net neutrality for years. But if bureaucrats and politicians truly cared about neutrality, they would be wise to resist calls to expand the government’s power over private networks. Instead, policymakers should recognize that it is far more important for government to remain neutral toward competing business models — open, closed, or any combination thereof.

Continue reading →

Over at his always-informative Spectrum Blog, wireless guru Michael Marcus brings to my attention a new report, “The Economic Value Generated by Current and Future Allocations of Unlicensed Spectrum,” that will definitely be of interest to everyone here. It was written by Rich Thanki of Perspective Associates, a UK consulting firm. I haven’t had time to finish the whole thing yet, but it basically lays out the argument for opening up more spectrum, especially “white spaces,” to unlicensed use.

Anyway, Mike Marcus has a much better write-up of the report than I could ever do, so head over there to check out his discussion. One important thing that Mike stresses is the importance of technical flexibility:

But the key issue here is not the presence or absence of a license, the key issue is deregulation. A major reason why unlicensed networks have been so innovative is that the descendants of the FCC Docket 81-413 rulemaking, e.g. Wi-Fi, Bluetooth, and Zigbee, have been in spectrum bands with great technical flexibility… If you overregulate unlicensed systems, they can stagnate just as much as licensed ones often do.

I think that is an important insight and essential lesson that we should always keep in mind when it comes to spectrum policy, regardless of whether we’re talking about licensed or unlicensed spectrum. Although I’ve always been a bit torn about how much spectrum should be allocated on an unlicensed (or “commons”) basis versus auctioned (property rights model), as Marcus suggests, flexibility is crucial in either case. In all the heated catfights over licensed and unlicensed spectrum, that point sometimes gets overlooked.

Forbes.com has just published an editorial that Berin Szoka and I penned about yesterday’s net neutrality announcement from the FCC.

The Day Internet Freedom Died

by Adam Thierer & Berin Szoka

There was a time, not so long ago, when the term “Internet Freedom” actually meant what it implied: a cyberspace free from over-zealous legislators and bureaucrats. For a few brief, beautiful moments in the Internet’s history (from the mid-90s to the early 2000s), a majority of Netizens and cyber-policy pundits alike all rallied around the flag of “Hands Off the Net!” On censorship efforts, encryption controls, online taxes, privacy mandates, and infrastructure regulations alike, there was a general consensus as to how much authority government should have over cyber-life and our cyber-liberties. Simply put, there was a “presumption of liberty” in all cyber-matters.

Those days are now gone; the presumption of online liberty is giving way to a presumption of regulation. A massive assault on real Internet freedom has been gathering steam for years and has finally come to a head. Ironically, victory for those who carry the banner of “Internet Freedom” would mean nothing less than the death of that freedom.

We refer to the gradual but certain movement to have the federal government impose “neutrality” regulation for all Internet actors and activities—and in particular, to yesterday’s announcement by Federal Communications Commission (FCC) Chairman Julius Genachowski that new rules will be floated shortly. “But wait,” you say, “You’re mixing things up! All that’s being talked about right now is the application of ‘simple net neutrality,’ regulations for the infrastructure layer of the net.” You might even claim these rules are not really regulation at all, but pro-freedom principles to keep the net “free and open.”

Such thinking is terribly short-sighted. Here is the reality: Because of the steps being taken in Washington right now, real Internet Freedom—for all Internet operators and consumers, and for economic and speech rights alike—is about to start dying a death by a thousand regulatory cuts. Policymakers and activist groups are ramping up the FCC’s regulatory machine for a massive assault on cyber-liberty. This assault rests on the supposed superiority of common carriage regulation and “public interest” mandates over not just free markets and property rights, but over general individual liberties and freedom of speech in particular. Stated differently, cyber-collectivism is back in vogue—and it’s coming very soon to a computer near you! Continue reading →

If I can amplify a bit on a post at the Cato blog earlier today, I want to clarify that I fully agree that some of the ISP behaviors that net neutrality proponents have identified as demanding a regulatory response really are seriously problematic. My point of departure is that I’d rather see if there are narrower grounds for addressing the objectionable behaviors than making sweeping rules about network architecture. So in the case of Comcast’s throttling of BitTorrent, which is the big one that seems to confirm the fears of the neutralists, I think it’s significant that for a long while the company was—“lying about” assumes intent, so I’ll charitably go with “misrepresenting”—their practices. And I don’t think you need any controversial premises about optimal network management to think that it’s impermissible for a company to charge a fee for a service, and then secretly cripple that service. So without even having to hit the more controversial “nondiscrimination” principle Julius Genachowski proposed on Monday, you can point to this as a failure of the “transparency” principle, about which I think there’s a good deal more consensus. Now, there are bigger guns out there looking for dodgy filtering practices these days, so I’d expect the next attempt at this sort of thing to get caught more quickly, but by all means, enforce transparency about business practices too. Consumers have a right to get the service they’ve bought without having to be 1337 haxx0rz to discover how they’re being shortchanged. But before we get the feds involved in writing code for ISP routers, I’d like to see whether that proves sufficient to limit genuinely objectionable deviations from neutrality.

There’s a hoary rule of jurisprudence called the canon of constitutional avoidance. It means, very crudely, that judges don’t decide broad constitutional questions—they don’t go mucking with the basic architecture of the legal system—when they have some narrower grounds on which to rule. So if, for instance, there are two reasonable interpretations of a statute, one of which avoids a potential conflict with a constitutional rule, judges are supposed to prefer that interpretation. It’s not always possible, of course: Sometimes judges have to tackle the big, broad questions. But it’s supposed to be something of a last resort. Lawyers and civil liberties advocates, of course, tend to get more animated by those broad principles, whether the First Amendment or end-to-end. But there’s often good reason to start small—to look at the specific fact patterns of problem cases and see whether there are narrower bases for resolution. It may turn out that in the kinds of cases that neutralists rightly warn could harm innovation, it’s not one big principle but a diverse array of responses or fixes that will resolve the different issues. In a case like this one, perhaps a mix of mandated transparency, consumer demand, and user adaptation (e.g., encrypting traffic) will get you the same result as an architectural mandate—or a better one.

Continue reading →