I’ve written here before about the Clear card, which allows people to prove their membership in the Transportation Security Administration’s Registered Traveler program without telling TSA who they are. I disapprove of Registered Traveler, but if it’s going to exist, the Clear system’s restrictive handling of users’ identities is a key anti-surveillance feature.

Today, the House Homeland Security Committee’s Subcommittee on Transportation Security and Infrastructure Protection is holding a hearing entitled “Managing Risk and Increasing Efficiency: An Examination of the Implementation of the Registered Traveler Program.”

Steven Brill, the Chairman and CEO of Clear, is one of the witnesses, and he has some choice criticisms of TSA.


Good piece in the Wall Street Journal yesterday by Dennis Patrick (former FCC Chairman) and Thomas Hazlett (former FCC Chief Economist) on the Fairness Doctrine. In their editorial, “The Return of the Speech Police,” they argue that the Doctrine represented “well-intended regulation gone wrong” and that “re-imposing ‘fairness’ regulation would be a colossal mistake.” They continue:

The Fairness Doctrine was bad public policy. It rested on the presumption that government regulators can coolly review editorial choices and, with the power to license (or not license) stations, improve the quantity and quality of broadcast news. Yet, as the volcanic eruption triggered by repeal amply demonstrated, government enforcement of “fairness” was extremely political.

Evaluations were hotly contested; each regulatory determination was loaded with implications for warring factions. The simple ceases to be easy once government is forced to issue blanket rules. What public issues are crucial to cover? How many contrasting views, and presented by whom, in what context, and for how long? The Fairness Doctrine brought a federal agency into the newsroom to second-guess a broadcaster’s editorial judgments at the behest of combatants rarely motivated by the ideal of “balanced” coverage.


There, Too

July 31, 2007

Commentary on recent real estate woes in Second Life. I’ve been thinking of opening an office there. Sort of a retreat. An asylum, as it were.

Worst-case Scenario

July 31, 2007

Voting machine vendors are their own worst enemies:

The study, conducted by the university under a contract with Bowen’s office, examined machines sold by Diebold Election Systems, Hart InterCivic and Sequoia Voting Systems.

It concluded that they were difficult to use for voters with disabilities and that hackers could break into the systems and change vote results.

Machines made by a fourth company, Elections Systems & Software, were not included because the company was late in providing information that the secretary of state needed for the review, Bowen said.

Sequoia, in a statement read by systems sales executive Steven Bennett, called the UC review “an unrealistic, worst-case-scenario evaluation.”

Right. Because the way to tell whether a system is secure is to focus on the best-case scenario.

I guess I shouldn’t be surprised. Voting machine vendors have a track record of releasing jaw-droppingly lame responses to criticisms of their products, so why not continue the pattern?

I agree with Tim that open networks are great and likely preferable in most situations, but to say that open networks simply “tend to be better than closed networks” doesn’t make sense.

This is akin to saying that copper is more efficient than iron, which only raises the question: more efficient at what? Copper is more efficient than iron in some applications, like conducting electricity, but it makes far less effective armor plating. Ends dictate the standard by which we judge efficiency; otherwise, efficiency is meaningless.

That said, not all networks are built for the same ends. While the Internet is an undisputed engine of growth and innovation, it’s not the only model that EVER makes sense. Closed or limited networks can also have value, because Metcalfe’s Law, which states that a network’s utility increases in proportion to the square of the number of members, is a very strong factor in determining a network’s worth, but not the only one.


Cord makes some good points about the disadvantages of open networks, but I think it’s a mistake for libertarians to hang our opposition to government regulation of networks on the contention that closed networks are better than open ones. Although it’s always possible to find examples on either side, I think it’s pretty clear that, all else being equal, open networks tend to be better than closed networks.

There are two basic reasons for this. First, networks are subject to network effects—the property that the per-user value of a network grows with the number of people connected to the network. Two networks with a million people each will generally be less valuable than a single network with two million people. The reason TCP/IP won the networking wars is that it was designed from the ground up to connect heterogeneous networks, which meant that it enjoyed the most potent network effects.
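To make that arithmetic concrete, here is a minimal sketch in Python, taking Metcalfe’s value-grows-as-n-squared rule at face value (a simplifying assumption for illustration, not a precise economic model):

```python
# Metcalfe's Law approximates a network's value as proportional to the
# square of its user count. We use value(n) = n**2 purely for
# illustration; real network value likely grows more slowly.

def metcalfe_value(users: int) -> int:
    """Relative value of a single network with the given number of users."""
    return users ** 2

# Two isolated networks of one million users each...
separate = metcalfe_value(1_000_000) + metcalfe_value(1_000_000)

# ...versus one interconnected network of two million users.
merged = metcalfe_value(2_000_000)

print(f"two separate networks: {separate:.2e}")  # 2.00e+12
print(f"one merged network:    {merged:.2e}")    # 4.00e+12
print(f"ratio: {merged / separate:.1f}x")        # 2.0x
```

Under that rough assumption, interconnecting the two networks doubles their total value, which is the intuition behind TCP/IP’s victory: the protocol that connects heterogeneous networks captures the largest n.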

Second, open networks have lower barriers to entry. Here, again, the Internet is the poster child. Anybody can create a new website, application, or service on the Internet without asking anyone’s permission. There’s a lot to disagree with in Tim Wu’s Wireless Carterfone paper, but one thing the paper does is eloquently demonstrate how different the situation is in the cell phone world. A lot of innovative mobile applications would likely be created if it weren’t so costly and time-consuming to get the telcos’ permission to develop for their networks.


Tomorrow, the FCC is scheduled to meet and adopt rules for the upcoming auction of spectrum usage rights in the 700 MHz band for wireless services. A number of interests have been crowding around, trying to get the FCC to slant the auction rules in their favor.

I’ve written a Cato TechKnowledge on the topic: “How About an Open Auction?”

The rag-tag army myth has made its return, this time in a front-page story in the Washington Post. In case you don’t remember, I wrote several times last year (here, here, and here) about the persistent myth that advocates of net neutrality are an outnumbered and outgunned “rag-tag” army fighting against the odds. The idea, of course, is preposterous: the firms supporting neutrality regulation are among the largest on Earth.

Preposterous or not, the Washington Post picked up the theme today in a piece on the FCC’s 700 MHz auction, writing that “Google’s 12-person Washington team, based in temporary quarters on Pennsylvania Avenue, has aggressively confronted the legions of lobbyists behind the two telecom behemoths [Verizon and AT&T].”

One can just imagine the poor, outnumbered Googlers fighting off endless hordes of telecom company lobbyists. Things are looking desperate; they take stock of their resources and find they are down to their last… $160 billion.

That’s right, Google’s market capitalization tops $160 billion. That’s larger than Verizon’s (though less than AT&T’s). By any measure, Google is one of the largest corporations on earth. While perhaps new to the Washington policy world, it’s hardly outgunned in terms of resources. This is a company that pledged last week to bid $4.6 billion for spectrum if the FCC adopted the regulations it wanted. As Everett Dirksen might have said, $4.6 billion here and $4.6 billion there, and pretty soon you’re talking about real money.

Don’t get me wrong: Google has every right to its wealth; it earned it. And I have nothing against its DC team, who all seem like nice fellows. But can we please call a halt to this game of “who’s the underdog?” These guys are big cats, and an underdog’s cape would just look silly on them.

Openness. In our culture of feel-goodery and self-congratulation, openness is seen as a good thing, a trait that any liberal and modern person should hope to have. But is openness always the best policy?

Google sure thinks so. It’s advocating that the 700 MHz spectrum, soon to be freed up by the transition to digital TV, should be auctioned with openness in mind. Eric Schmidt, Google’s CEO, has asked FCC Chairman Martin to limit the auction to models that would include open applications, open devices, open services, and open networks.

Sounds great, doesn’t it? After all, other open things in the political world are good. Open government, open hearings: both good. But would we want open phone conversations or open email? Maybe open doors and open shades would be a good idea. What do you have to hide?

Living in a democracy, we’re used to transparency, but we can certainly recognize the value of limits and closed proceedings as well. What about limited and closed models for networks? Can these be of any benefit, or do they, as the technocrats claim, just stifle innovation?

Closed networks, or rather networks that aren’t wide open, offer some significant advantages. Security, for one, is markedly enhanced by a closed or limited-access system. That’s why our national security networks, at least those beyond the Pentagon’s email servers, are often totally severed from the wide-open Internet.

An open network, like the Internet itself, is prone to every variety of attack. By contrast, I’ve never gotten a cell phone virus, something I owe to my cell carrier’s closed system. My phone also seldom crashes, unlike my PC. I’m sure I owe much of my PC’s woes to its OS, but the various apps I run were likely not custom-made for my particular machine, unlike the apps found on many cell phones.

Let’s think different for a moment and consider Apple. The Mac has always been a fairly limited, if not closed, system, yet this walled garden isn’t seen as an evil. That’s likely because Macs work so well, but it’s crucial to recognize that much of this is owed to the Mac’s closed architecture, which eliminates many of the variables that plague PCs.

Google may have a business model that makes sense under its proposed restrictions, but forcing that model on others isn’t justified by some overarching philosophy of “openness.” Rather, Google wants to save money at auction by driving out many of the other bidders. That’s a shame. While an open wireless network is intriguing and could create a platform for unique innovations, limited networks still offer stability, compatibility, security, and privacy, and they should be allowed to compete.

Google’s Policy Blog today makes a succinct argument for why its purchase of DoubleClick should be approved. While I find its reasoning compelling and logical (in fact, I don’t think any justification should be necessary), I find it hard to be sympathetic to a plea for fairness when Google is asking DC to stack the deck in its favor on other issues.

Example: Google has issued an ultimatum to the FCC, asking it to offer up the 700 MHz spectrum, the radio waves that will be freed when TVs switch over to digital in 2009, with conditions attached. These conditions would make all potential bidders conform to Google’s business model.

What other example in history do we have of a company actually demanding that strings be attached to an FCC auction like this? If anyone can think of such an example, I encourage you to comment. As far as I know, this is totally unprecedented.

And why ask the FCC to place limits on something you plan to buy? That seems a little odd, unless you want to reduce the value of the spectrum to competitors that operate under different models.

What about these other models? More on that when I discuss the idea of “openness” in a post later today.

These types of restrictions are just political games, which Google doesn’t like when they prevent Larry and Sergey from making an acquisition or collecting different kinds of data. Yet the same political maneuvering is just fine when the men of Mountain View can use hapless regulators to make a mint at the public’s expense.

Hat Tip: John Battelle’s Searchblog