Tomorrow is the 2008 Politics Online Conference, and I’m prepping for the event by guest-blogging over at the Institute for Politics, Democracy and the Internet. The panel I’ll be moderating is called “Building a Broadband Strategy for America,” and you can read more about it at the Politics Online site.
I’ve blogged about this panel on this site previously, so I won’t recount that, other than to repeat that I’ll be joined by FCC Commissioner Jonathan Adelstein, by Professor Tim Wu, who coined the term “Net Neutrality,” and by Eric Werner, a senior official at the National Telecommunications and Information Administration of the Commerce Department.
On the IPDI blog, I address how BroadbandCensus.com plays into the National Broadband Strategy debate:
As a technology reporter, I’ve been writing about the battles over broadband for nearly a decade here in Washington. There is one fact about which nearly everyone seems to be in agreement: if America wants better broadband, America needs better broadband data. That’s why I’ve recently started a new venture to collect this broadband data, and to make the data available for all on the Web at BroadbandCensus.com.
Read the rest of Want Better Broadband in America? Take the BroadbandCensus.com!
Larry Lessig is a gifted writer, and he does a good job of finding and telling stories that illustrate the points he’s trying to make. I found Free Culture extremely compelling for just this reason: he illustrates a fairly subtle but pervasive problem by finding representative examples and weaving them into a narrative about the broader problem. He demonstrates that copyright law has become so pervasive that people’s freedom is restricted in a thousand subtle ways by its over-broad application.
He takes a similar approach in Code, but the effort falls flat for me. Here, too, he gives a bunch of examples in which “code is law”: the owners of technological systems are able to exert fine-grained control over their users’ online activities. But the systems he describes to illustrate this principle have something important in common: they are proprietary platforms whose users have chosen to use them voluntarily. He talks about virtual reality, MOOs and MUDs, AOL, and various web-based fora. He does a good job of explaining that the different rules of these virtual spaces—their “architecture”—have a profound impact on their character. The rules governing an online space interact in complex ways with its participants to produce a wide variety of online spaces with distinct characters.
Lessig seems to want to generalize from these individual communities to the “community” of the Internet as a whole. He wants to say that if code is law on individual online communications platforms, then code must be law on the Internet in general. But this doesn’t follow at all. The online fora that Lessig describes are very different from the Internet in general. The Internet is a neutral, impersonal platform that supports a huge variety of different applications and content. The Internet as a whole is not a community in any meaningful sense. So it doesn’t make sense to generalize from individual online communities, which are often specifically organized to facilitate control, to the Internet in general, which was designed with the explicit goal of decentralizing control to the endpoints.
Also, the cohesiveness and relative ease of control one finds in individual online communities occur precisely because users tend to use any given online service voluntarily. Users face pressure to abide by the generally accepted rules of the community, and users who feel a given community’s rules aren’t a good fit will generally switch to a new one rather than make trouble. In other words, code is law in individual Internet communities precisely because there exists a broader Internet in which code is not law. When an ISP tries to control its users’ online activities, users are likely to react very differently. As we’ve seen in the case of the Comcast kerfuffle, users do not react in a docile fashion to ISPs that attempt to control their online behavior. And at best, such efforts produce only limited and short-term control.
I’m re-reading Larry Lessig’s Code and Other Laws of Cyberspace. I last read it about four years ago, long enough ago that I’d forgotten a lot of the specific claims Lessig made. One prediction that clearly has not panned out is that we would develop a “general architecture of trust” that would “permit the authentication of a digital certificate that verifies facts about you—your identity, citizenship, sex, age, or the authority you hold.” Lessig thought that “online commerce will not fully develop until such an architecture is established,” and that way back in 1999, we could “see enough to be confident that it is already developing.”
Needless to say, this never happened, and it now looks unlikely that it ever will. The closest we came was Microsoft’s Passport, which was pretty much a flop. We have instead evolved a system in which people have dozens of lightweight online identities for the different websites they visit, many of which involve little more than setting a cookie on one’s browser. The kind of universal, monolithic ID system that would allow any website to quickly and transparently learn who you are seems much less likely today than it apparently seemed to Lessig in 1999.
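To make concrete just how lightweight these identities are, here’s a minimal sketch of the pattern (my illustration, not any particular site’s implementation): the site mints an opaque token, sets it as a cookie, and learns nothing whatsoever about your identity, citizenship, sex, or age in the bargain.

```python
# A minimal, illustrative sketch of a "lightweight identity": the site
# knows you only as an opaque token, not as a verified person.
import secrets
from http.cookies import SimpleCookie

def issue_lightweight_identity() -> SimpleCookie:
    """Mint a random per-site token and wrap it in a Set-Cookie header."""
    cookie = SimpleCookie()
    cookie["session_id"] = secrets.token_urlsafe(32)  # opaque and unguessable
    cookie["session_id"]["httponly"] = True  # hide the token from page scripts
    cookie["session_id"]["secure"] = True    # send only over HTTPS
    return cookie

if __name__ == "__main__":
    print(issue_lightweight_identity().output())
    # e.g. Set-Cookie: session_id=9xKj...; HttpOnly; Secure
```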
Of course, this would have been obvious to Lessig if he’d had the chance to read Jim Harper’s Identity Crisis. Jim explained that the security of an identifier is a function not only of the sophistication of its security techniques, but also of the payoff for breaking it. A single, monolithic identifier is a bad idea because it becomes an irresistible target for the bad guys. It’s also insufficiently flexible: security rules that are robust enough for online banking are going to be overkill for casual web surfing. What I want, instead, is a range of identifiers with varying levels of security, tailored to the sensitivity of the systems to which they control access.
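Here’s a sketch of what that tiering might look like, with entirely hypothetical names and dollar figures. The point is simply that the identifier should be no stronger, and no more attractive a target, than the system it protects warrants.

```python
# Illustrative only: a hypothetical policy table matching identifier
# strength to the sensitivity of the system it guards.
from dataclasses import dataclass

@dataclass
class IdentifierPolicy:
    name: str      # what kind of identifier the tier relies on
    max_loss: int  # rough ceiling, in dollars, on what a breach could cost

POLICY_BY_TIER = {
    "casual_browsing": IdentifierPolicy("anonymous cookie", max_loss=0),
    "web_forum":       IdentifierPolicy("username + password", max_loss=100),
    "online_banking":  IdentifierPolicy("password + second factor", max_loss=100_000),
}

def pick_tier(expected_loss: int) -> str:
    """Choose the cheapest tier whose loss ceiling covers the stakes."""
    for tier, policy in sorted(POLICY_BY_TIER.items(),
                               key=lambda kv: kv[1].max_loss):
        if policy.max_loss >= expected_loss:
            return tier
    return "online_banking"  # fall back to the strongest tier

if __name__ == "__main__":
    print(pick_tier(50))  # -> "web_forum"
```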
Online security isn’t much about technology at all. For example, the most important safeguard against online credit card fraud isn’t SSL. It’s the fact that someone trying to buy stuff with a stolen credit card has to give a delivery location, which can be used by the police to apprehend him. Our goal isn’t and shouldn’t be maximal security in every transaction. Rather, it’s to increase security until the marginal cost of additional security begins to outweigh the resulting reduction in fraud. If the size of a transaction is reasonably low, and most people are honest, quite minimalist security precautions may be sufficient to safeguard it. That appears to be what’s happened so far, and Lessig’s prediction to the contrary is starting to look rather dated.
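To put that marginal logic in concrete terms, here’s a toy calculation with made-up numbers: a merchant keeps adopting security measures as long as each one prevents more fraud than it costs to run, and stops at the first one that doesn’t.

```python
# Toy calculation, entirely hypothetical figures. Measures are listed in
# order of decreasing cost-effectiveness; adopt each one while it prevents
# more fraud (dollars per year) than it costs to run.
measures = [
    # (name, annual cost, annual fraud prevented)
    ("require CVV code",        1_000, 50_000),
    ("velocity checks",         5_000, 20_000),
    ("address verification",   10_000, 12_000),
    ("manual review of orders", 80_000, 15_000),  # costs more than it saves
]

adopted = []
for name, cost, fraud_prevented in measures:
    if fraud_prevented > cost:  # marginal benefit still exceeds marginal cost
        adopted.append(name)
    else:
        break                   # past this point, extra security is overkill

print("Worth adopting:", adopted)
# -> Worth adopting: ['require CVV code', 'velocity checks', 'address verification']
```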
So I’ve finished reading the Frischmann paper. I think it makes some interesting theoretical observations about the importance of open access to certain infrastructure resources. But I think the network neutrality section of the paper is weakened by a lack of specificity about what’s at stake in the network neutrality debate. He appears to take for granted that the major ISPs are able and likely to transform the Internet into a proprietary network in the not-too-distant future. Indeed, he seems to regard this point as so self-evident that he frames it as a simple political choice between open and closed networks.
But I think it’s far from obvious that anyone has the power to transform the Internet into a closed network. I can count on my fingers the number of serious reports of network neutrality violations, and no ISP has come anywhere close to transforming its network into a proprietary network like AOL circa 1994. Larry Lessig raised the alarm about that threat a decade ago, yet if anything, things have gotten more, not less, open in the last decade. We have seen an explosion of mostly non-commercial, participatory Internet technologies like Wikipedia, Flickr, blogs, YouTube, RSS, XMPP, and so forth. We have seen major technology companies, especially Google, throw their weight behind the Internet’s open architecture. I’m as happy as the next geek to criticize Comcast’s interference with BitTorrent, but that policy has neither been particularly successful in preventing BitTorrent use, nor been emulated by other ISPs, nor proven a harbinger of the imminent AOL-ization of Comcast’s network.
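For what it’s worth, the Comcast episode also shows why such interference is hard to hide: the throttling, as outside testers documented, worked by injecting forged TCP reset (RST) packets into BitTorrent sessions, and anyone with a packet sniffer could watch it happen. Here’s a rough, illustrative sketch of the idea (it requires scapy and root privileges, and the interface name is just a placeholder):

```python
# Illustrative sketch: counting TCP RST packets on the wire. A burst of
# RSTs arriving mid-transfer is the signature researchers used to
# document Comcast's BitTorrent interference. Not a complete detector.
from collections import Counter
from scapy.all import TCP, sniff  # pip install scapy; run as root

rst_counts: Counter = Counter()

def note_rst(pkt) -> None:
    """Log every TCP segment with the RST flag set."""
    if pkt.haslayer(TCP) and pkt[TCP].flags & 0x04:  # 0x04 is the RST bit
        key = (pkt[TCP].sport, pkt[TCP].dport)
        rst_counts[key] += 1
        print(f"RST on {key[0]}->{key[1]} (total {rst_counts[key]})")

# "eth0" is a placeholder; substitute your machine's interface.
sniff(iface="eth0", filter="tcp[tcpflags] & tcp-rst != 0", prn=note_rst)
```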
Before we can talk about whether a proprietary Internet is desirable (personally, I think it isn’t), I think we have to first figure out what kinds of changes are plausible. I have yet to see anyone tell a coherent, detailed story about what AT&T or Verizon could do to achieve the results that Lessig, Benkler, Wu, and Frischmann are so worried about. Frischmann, like most advocates of network neutrality regulation, seems to simply assume that ISPs have this power and moves on to the question of whether it’s desirable. But in doing so, such advocates are likely to produce regulation that’s not only unnecessary, but quite possibly incoherent. If you don’t quite know what you’re trying to prevent, it’s awfully hard to know which regulations are necessary to prevent it.
This may not be of interest to anyone but me, but on the theory that it’s better to post them all and let Google sort them out, here are the books I read for my forthcoming paper on network neutrality.
Continue reading →
Here’s an account of the FCC’s Boston meeting on Comcast’s network management policies, from Tom Giovannetti.
Somewhere, there are more sophisticated arguments for net neutrality:
The setting where a monopoly infrastructure business, in pursuit of its own ends, could take arbitrary steps that would ruin one business and make another succeed, was regarded as inimical to a really free market. It resembled far too much the widely disliked markets without property rights, dominated by a capricious political power. So what followed was a long period of increasingly stringent regulation.
One might conclude from this discussion of historical precedents for regulation of networks that something like network neutrality ought to be attempted. My take, in brief, is that Andrew’s paper understates the aspects of the cure that are worse than the disease and neglects the history of networks beyond a simple pricing story. It is time to try another approach. But there could be some interesting discussion.
Interestingly, though, the current trend in the Comcast proceeding bears no resemblance to a reasoned attempt to provide a real solution to any real problem, to consider the history of networks, or to consider a range of solutions and their tradeoffs. It seems to be an exercise in pure faddish populism. Curious. One wonders what a court will make of it. Mincemeat, I suspect.
Dominance in the broadband market is a battle of both technology and politics. Right now Comcast, America’s leading cable company, is losing on both counts.
Comcast Executive Vice President David Cohen emerged from the Federal Communications Commission’s hearing on Internet practices in Cambridge, Mass., unable to defend himself and his company against charges of blocking the peer-to-peer (P2P) Internet application BitTorrent.
Comcast also came out looking like the kind of bullying corporation that resorts to packing the auditorium with its own employees.
Besides, Comcast is not a very good FOK, or Friend of Kevin — as in Kevin Martin, the chairman of the agency. Martin has done nearly everything in his power to harm Comcast and the cable industry since he took over the FCC in March 2005.
Continue reading →
The Federal Communications Commission conducted a public hearing this week on network management before a group of law students – as opposed to, say, engineering students, who are the ones who actually study network management – at which lead witness Rep. Ed Markey (D-MA) declared:
[T]he Internet is as much mine and yours as it is Verizon’s, AT&T’s or Comcast’s. Please keep front and center in your examination the needs and wishes of the community of users rather than a small coterie of carriers.
As a matter of law, Markey would have flunked if that were an exam question. But of course the government has a right to try to control whatever it wishes one way or another.
The interesting and relevant question is whether and to what degree it’s possible to proscribe network management practices which most reasonable people would consider inappropriate without unintentionally preventing network providers from trying to improve their services while earning a competitive return on their investment.
“[C]learly, complicated network architectures, Internet viruses, and capacity limitations raise real-world, complex and valid questions,” conceded FCC Commissioner Michael J. Copps. “Our job is to figure out when and where you draw the line between discrimination and reasonable network management.”
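One way to see where that line might fall: the sort of management most engineers consider unobjectionable is application-agnostic. The classic token-bucket rate limiter sketched below (an illustration of the general technique, not any ISP’s actual system) looks only at traffic volume, never at which protocol or application generated it.

```python
# A sketch of application-agnostic "reasonable network management":
# a token bucket caps average throughput while permitting bursts,
# without ever inspecting which application sent the traffic.
import time

class TokenBucket:
    """Classic token-bucket limiter: caps average rate, allows bursts."""

    def __init__(self, rate_bytes_per_sec: float, burst_bytes: float):
        self.rate = rate_bytes_per_sec
        self.capacity = burst_bytes
        self.tokens = burst_bytes
        self.last = time.monotonic()

    def allow(self, packet_bytes: int) -> bool:
        """Admit the packet if enough tokens have accrued; else drop or queue.

        The decision depends only on packet size and elapsed time,
        never on the protocol or content of the packet.
        """
        now = time.monotonic()
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= packet_bytes:
            self.tokens -= packet_bytes
            return True
        return False
```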
Continue reading →
One of the many reasons that those of us who cherish free markets and limited government oppose net neutrality regulation is because we believe it will be a major step down the slippery slope to far more comprehensive regulation of the Internet. Once we let this regulatory genie out of the bottle and the bureaucrats get their tentacles around the Net, a host of other misguided restrictions on Internet activities will likely follow.
One of the more destructive of these potential outcomes would be full-blown structural separation of broadband networks, such that government would force network owners to spin off their retail arms and become pure wholesalers of access (on government-set terms and price-controlled rates, of course). In a nutshell, this is the old regulatory playbook that did very little to benefit consumers or competition. Amazingly, however, we already have someone suggesting it as the logical next step after we get done slapping net neutrality mandates on the Internet. Writing in the Boston Globe on Saturday, David Weinberger, a fellow at the Harvard Berkman Center, says we need to take the next step and think about busting up broadband networks into atomistic bits:
“An Internet delivered by a tiny handful of old-technology providers, even if constrained by Net neutrality, doesn’t get us to the second vision. It doesn’t give us access laid like a blanket over the entire country, rich and poor alike. It doesn’t give us a Net that we make together, rather than a Net the contents of which we consume. For that, we need more than Net neutrality. We need a structural change. We gave the incumbent providers their chance. They have failed. The FCC could decide to once again require them to act as wholesalers to local Internet Service Providers, which would offer genuine competition on price, access, reliability, services, and whatever other differentiators an open market would devise.”
Back in 2002, Wayne Crews and I penned a paper for Cato entitled, “The Digital Dirty Dozen: The Most Destructive High-Tech Legislative Measures of the 107th Congress,” and we named a structural separation proposal floating through Congress at that time as the single most destructive measure of the year. What we said then of structural separation for older wireline telecom networks is every bit as true today regarding proposals to impose structural separation on broadband networks–perhaps even more so since we would be talking about structural separation for telco, cable and wireless networks. As Wayne and I argued back in ’02:
Continue reading →
Got busy last week and failed to blog about this Wall Street Journal column by my colleague Bret Swanson and tech visionary George Gilder about the dangers of net neutrality regulation. They argue that:
The petitions under consideration at the FCC and the Markey net neutrality bill would set an entirely new course for U.S. broadband policy, marking every network bit and byte for inspection, regulation and possible litigation. Every price, partnership, advertisement and experimental business plan on the Net would have to look to Washington for permission. Many would be banned. Wall Street will not deploy the needed $100 billion in risk capital if Mr. Markey, digital traffic cop, insists on policing every intersection of the Internet.
And there’s another editorial in today’s WSJ by business author Andy Kessler entitled “Internet Wrecking Ball.” Kessler also points to the innovation-killing nature of NN regulation:
“With net neutrality, there will be no new competition and no incentives for build outs. Bandwidth speeds will stagnate, and new services will wither from bandwidth starvation. … The trick to an open and innovative Internet is not sneaky technical fixes nor more rules and regulations and bureaucracies to enforce them. The Internet will only expand based on competitive principles, not socialist diktat. The more we can do to clear a path, the greater our national wealth will be.”
Make sure to read both pieces.