Wireless & Spectrum Policy

USA Today reports that most users are unaware of the dangers facing them at public Wi-Fi hotspots, which brought to mind an interesting question about municipal Wi-Fi: what incentive is there for municipalities to provide encryption and other security technologies?

The article mentions that AT&T and T-Mobile are the largest providers of free Wi-Fi hookups in the country, and although the Wi-Fi itself is unsecured, both companies encourage the use of freely provided encryption software. The incentives for both companies seem fairly obvious: if people are going to be Wi-Fi users, they need to feel safe, and encryption technology is one way to accomplish that. Customers stay safe and continue to use the service, making AT&T, T-Mobile, and other providers money.

Do municipal setups have the same incentives? Depending on the financial structure of such a system, I can see how there would be little incentive to provide security software or other safeguards to users. Yet these Muni-Fi services would still distort the market, making it less likely that companies whose business might be affected by privacy concerns will invest in those areas.

Question: Does Muni-Fi pose a security risk because it lacks the incentive to push security solutions, while edging out private competitors who have that motivation?

In the most recent podcast, Jim Harper and I had a little back-and-forth about the idea of a commons model for spectrum. I made the point that while I was hopeful for the future, technology that makes spectrum scarcity a thing of the past (thus allowing a commons to work) isn’t quite here yet. Regulating based on theoretical technology, I said, doesn’t bode well for the here and now.

Well, today comes word that the FCC has rejected the mystery whitespace devices that Google, Microsoft, and others in a consortium pushing for commons treatment of parts of the 700 MHz band had offered for testing. A year ago, the New America Foundation put out a paper called “Why Unlicensed Use of Vacant TV Spectrum Will Not Interfere with Television Reception.” According to The Washington Post today,

After four months of testing, the agency concluded that the devices either interfered with TV signals or could not detect them in order to skirt them. Now the coalition of companies backing the devices, which includes Dell, Intel, EarthLink, Hewlett-Packard and Philips, is going back to the drawing board, possibly to redesign the devices and meet with FCC engineers to explore other options. The FCC said Tuesday that it would continue experimenting with such devices, which use vacant TV frequencies.

I really hope they succeed, because I don’t think there’s anything wrong with allowing free use of true whitespaces as a commons, as long as the technology really works and use truly doesn’t cause interference to adjacent license holders. That said, we can’t devalue otherwise useful spectrum by allocating it as a commons until we know the tech works.
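For readers wondering what “detect them in order to skirt them” involves, the simplest approach is energy detection: measure the power on each TV channel and treat anything below a threshold as vacant. Here is a minimal sketch of that idea in Python; the threshold, channel numbers, and readings are illustrative assumptions on my part, not the design of the devices the FCC tested.

```python
# Toy sketch of energy-detection spectrum sensing. All numbers here are
# made up for illustration; they are not the FCC's test parameters.

DETECTION_THRESHOLD_DBM = -114.0  # assumed sensitivity threshold

# Simulated per-channel power readings in dBm; a real device would take
# these from its radio front end.
readings = {
    21: -60.2,   # strong local broadcaster: occupied
    22: -118.5,  # quiet: candidate whitespace
    23: -116.9,  # quiet: candidate whitespace
    24: -75.0,   # occupied
}

def vacant_channels(readings, threshold=DETECTION_THRESHOLD_DBM):
    """Return channels whose measured energy falls below the threshold."""
    return [ch for ch, dbm in sorted(readings.items()) if dbm < threshold]

print(vacant_channels(readings))  # -> [22, 23]
```

The engineering rub is that a broadcast signal can be too weak to register at the device’s location yet still reach a nearby television set, so a naive threshold check can green-light a channel that is, for the neighbors, very much occupied. That appears to be the hard problem the prototypes failed to solve.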

Harold Feld speculates on a Google/Sprint/Clearwire consortium. All well and good – and just as viable without a government subsidy, in an open, full-price auction.

The FCC’s 700 MHz plan adopted yesterday embraces, for the most part, Frontline Wireless’s plan for a national public safety network. It’s really an amazing thing considering that nine months ago Frontline Wireless didn’t exist (at least not in public), while Cyren Call had been making noise for months. As I’ve said before, I’m not crazy about Frontline’s plan, but I like it better than Cyren Call’s ill-fated proposal. That said, here are the pros and cons of the new rules as I see them (and without the benefit of the actual rules in front of me, because the FCC apparently hasn’t heard of this publishing technology called the World Wide Web).

Randy May of the Free State Foundation has a good piece out today, picking up on a prediction by the investment firm Stifel Nicolaus that the exact meaning of “open access” under yesterday’s 700 MHz decision likely won’t be determined for years. Stifel Nicolaus says 2009 is the likely date; that strikes May (and me) as optimistic, given the eight years it took to settle the unbundling rules under the 1996 telecom act.

This definitional long tail has consequences, May points out, because the venerable economic maxim that people don’t want to buy a pig in a poke holds true, even for the FCC. “Think about it,” he says. “In how many auctions have you bid when the rules concerning what you can do with your winning bid won’t be known until several years later?”

A good, but hardly reassuring, point. So you might as well get comfortable. This may go on for a while.

After weeks of intense lobbying, the FCC today set rules for the auction of former UHF TV channels 60-69 (in the prime 700 MHz range of frequencies). The full details are not yet out, but the decision seems to be largely what was expected: a “public-private partnership” for newly-allocated public safety spectrum, and — for commercial spectrum — new regulations that impose “open access” rules on 22 megahertz of the allocated frequencies.

No one was completely satisfied. Google and other wireless net neutrality proponents notably failed in their bid for more expansive regulation — with the Commission rejecting their calls for mandated interconnection and wholesale leasing of spectrum.

This loss may be due, in part, to a tactical fumble by Google itself. Its pledge last week to bid a minimum of $4.6 billion if the Commission adopted four proposed rules for these frequencies was perceived (rightly or wrongly) as an ultimatum to the FCC. Had the Commission then adopted Google’s proposed rules, the agency’s own credibility and independence would have been put at risk.

Cord makes some good points about the disadvantages of open networks, but I think it’s a mistake for libertarians to hang our opposition to government regulation of networks on the contention that closed networks are better than open ones. Although it’s always possible to find examples on either side, I think it’s pretty clear that, all else being equal, open networks tend to be better than closed networks.

There are two basic reasons for this. First, networks are subject to network effects—the property that the per-user value of a network grows with the number of people connected to the network. Two networks with a million people each will generally be less valuable than a single network with two million people. The reason TCP/IP won the networking wars is that it was designed from the ground up to connect heterogeneous networks, which meant that it enjoyed the most potent network effects.
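One rough way to put numbers on that claim is Metcalfe’s law, which values a network by its count of possible pairwise connections. That n-squared valuation is a contested heuristic rather than settled economics, but a back-of-the-envelope sketch shows why one big network tends to beat two half-sized ones:

```python
# Metcalfe's law values a network at roughly n^2 (the number of possible
# pairwise links). A rough heuristic, not settled economics.

def metcalfe_value(users: int) -> int:
    """Number of possible pairwise connections among `users` people."""
    return users * (users - 1) // 2

two_separate = 2 * metcalfe_value(1_000_000)  # two isolated 1M-user networks
one_merged = metcalfe_value(2_000_000)        # a single 2M-user network

print(one_merged / two_separate)  # ~2.0: the merged network is worth ~twice as much
```

Under that assumption, a single two-million-user network is worth roughly twice the two fragments combined, which is the intuition behind TCP/IP’s win: the protocol that stitched heterogeneous networks together captured the largest pool of network effects.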

Second, open networks have lower barriers to entry. Here, again, the Internet is the poster child. Anybody can create a new website, application, or service on the Internet without asking anyone’s permission. There’s a lot to disagree with in Tim Wu’s Wireless Carterfone paper, but one thing the paper does is eloquently demonstrate how different the situation is in the cell phone world. There are a lot of innovative mobile applications that would likely be created if it weren’t so costly and time-consuming to get the telcos’ permission to develop for their networks.

Tomorrow, the FCC is scheduled to meet and adopt rules regarding the upcoming auction of spectrum usage rights in the 700 MHz band for wireless services. A number of interests have been crowding around, trying to get the FCC to slant the auction rules in their favor.

I’ve written a Cato TechKnowledge on the topic: “How About an Open Auction?”

The rag-tag army myth has made its return — this time in a front-page story in the Washington Post. In case you don’t remember, I wrote several times last year (here, here, and here) on the persistent myth that advocates of net neutrality were an outnumbered and outgunned “rag-tag” army fighting against the odds. The idea, of course, is preposterous; the firms supporting neutrality regulation are among the largest on Earth.

Preposterous or not, the Washington Post picked up the theme today in a piece on the FCC’s 700 MHz auction, writing that “Google’s 12-person Washington team, based in temporary quarters on Pennsylvania Avenue, has aggressively confronted the legions of lobbyists behind the two telecom behemoths [Verizon and AT&T].”

One can just imagine the poor, outnumbered Googlers fighting off endless hordes of telecom company lobbyists. Things are looking desperate; they take stock of their resources and find they are down to their last… $160 billion.

That’s right, Google’s market capitalization tops $160 billion. That’s larger than Verizon’s (though less than AT&T’s). By any measure, Google is one of the largest corporations on earth. While perhaps new to the Washington policy world, it’s hardly outgunned in terms of resources. This is a company that pledged last week to bid $4.6 billion for spectrum if the FCC adopted the regulations it wanted. As Everett Dirksen might have said, $4.6 billion here and $4.6 billion there and pretty soon you are talking about real money.

Don’t get me wrong — Google has every right to its wealth; it earned it. And I have nothing against their DC team, who all seem like nice fellows. But can we please call a halt to this game of “who’s the underdog?” These guys are big cats, and an underdog’s cape would just look silly on them.

Openness. In our culture of feel-goodery and self-congratulation, openness is seen as a good thing, a trait that any liberal and modern person should hope to have. But is openness always the best policy?

Google sure thinks so. It’s advocating that the 700 MHz spectrum, soon to be freed up by the transition to digital TV, should be auctioned with openness in mind. Eric Schmidt, Google’s CEO, has asked FCC Chairman Martin to limit the auction to business models that would include open applications, open devices, open services, and open networks.

Sounds great, doesn’t it? After all, other open things in the political world are good. Open government, open hearings: both good. But would we want open phone conversations or open email? Maybe open doors and open shades would be a good idea. What do you have to hide?

Living in a democracy, we’re used to transparency, but surely we can recognize the value of limits and closed proceedings as well. What about limited and closed models for networks? Can these be of any benefit, or do they, as the technocrats claim, just stifle innovation?

Closed networks, or rather networks that aren’t wide open, offer some significant advantages. Security, for one, is markedly enhanced by a closed or limited-access system. That’s why our national security systems, at least those outside the Pentagon’s email servers, are often totally severed from the wide-open internet.

An open network, like the internet itself, is prone to every variety of attack. By contrast, I’ve never gotten a cell phone virus, something I owe to my cell carrier’s closed system. My phone also seldom crashes, unlike my PC. I’m sure I owe much of my PC’s woes to the OS, but the various apps I have running are likely not custom-made for my particular machine, unlike the apps found on many cell phones.

Let’s think different for a moment and consider Apple. The Mac has always been a fairly limited, if not closed, system, yet this walled garden isn’t seen as an evil. That’s likely because Macs work so well, but it’s crucial to recognize that much of this is owed to the Mac’s closed architecture, which eliminates many of the variables that plague PCs.

Google may have a business model that makes sense under its proposed restrictions, but its push to force that model on others isn’t driven by some overarching philosophy of “openness.” Rather, Google wants to save money at auction by driving out many of the other bidders. That’s a shame. While an open wireless network is intriguing and could create a platform for unique innovations, limited networks still offer stability, compatibility, security, and privacy, and they should be allowed to compete.