I’ve been trying to keep tabs on the status of various municipal wi-fi experiments going on across the nation by posting local news reports about them whenever I see them. The results so far have not been encouraging, but this hasn’t been that surprising since those of us who study these issues know that most wireline muni experiments failed too.
And speaking of failed wireline experiments, it appears there’s another one that might soon be added to the list. The Utah Telecommunications Open Infrastructure Agency–or “UTOPIA” as it is known–was created in 2002 by local Utah officials who wanted to bring high-speed Internet access to their communities. Eleven communities pledged roughly $200 million over 20 years to back the bonds needed to finance the construction of advanced fiber-optic facilities. Ultimately, the goal was to ensure inexpensive broadband for the masses at minimal cost to taxpayers.
But there are problems in paradise. According to this recent article by Steve Oberbeck of The Salt Lake Tribune:
[F]our years after 11 Utah cities… pledged to financially back the UTOPIA system, its finances are in shambles. Construction is behind schedule. Its top promoters have quit, and its newest chairman has uttered the unthinkable – that despite promises to the contrary, the cities that pledged their support eventually may have to pony up hundreds of millions in taxpayer dollars to prop up the system.
What went wrong?
I’ve concluded that one of the central fault lines in the network neutrality debate is over the extent to which physical ownership of a data pipe gives an owner the practical ability to exert fine-grained control over the use of that pipe. There’s an implicit assumption on the pro-regulation side of the debate that if AT&T owns your DSL line, then it has the physical ability to, say, prohibit you from watching online videos or require you to use their email or VoIP services. Lessig and Lemley, for example, made this point repeatedly in their 2000 paper without ever explicitly justifying it. For example:
Under the design proposed by the cable broadband, AT&T and Time Warner affiliates would have the power to decide whether these particular services would be “permitted” on the cable broadband network. Cable has already exercised this power to discriminate against some services.
This is backed up by a footnote citing various restrictions mentioned in @Home’s terms of service. But as I noted previously, the fact that @Home’s terms of service formally prohibited some category of network activities did not mean that, as a practical matter, users were unable to take advantage of that service. To the contrary, it’s extremely common for users to use their network connections in ways explicitly prohibited in the terms of service, and ISPs have struggled to crack down on those who do so. The fundamental issue is that classifying traffic is a very hard problem, one that almost certainly can’t be solved in the general case. Which means that any automated filtering regime can be circumvented. And of course, having human beings monitor every user’s traffic and impose restrictions on those who violate the terms of service would be far too labor-intensive to be worth the trouble. So ISPs are forced to resort to extremely crude tactics to accomplish their filtering goals, and these tactics, in turn, tend to produce both a lot of bad PR and the emergence of new, more sophisticated evasion tools. In the long run, it’s not at all clear to me that this is a battle ISPs could win, even if they had free rein to implement any policies they wanted without fear of regulatory intervention.
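To see how crude automated classification tends to be, consider a deliberately minimal sketch of the port-based filtering an ISP might attempt. Everything here is hypothetical except the port numbers, which are the standard IANA assignments:

```python
# A naive, hypothetical port-based traffic classifier. Real deep-packet
# inspection is more sophisticated, but faces the same basic limit.
WELL_KNOWN_PORTS = {
    80: "web",
    443: "web (TLS)",
    25: "email",
    5060: "voip",
}

def classify(dest_port):
    """Guess the application from the destination port alone."""
    return WELL_KNOWN_PORTS.get(dest_port, "unknown")

print(classify(5060))  # voip
# The obvious evasion: tunnel any service over port 443. The classifier
# sees only encrypted "web" traffic and cannot tell video, VoIP, or file
# sharing apart -- so blocking it means blocking the web itself.
print(classify(443))   # web (TLS)
```

A user who wants to evade this rule simply moves the prohibited traffic onto a permitted port, which is exactly the cat-and-mouse dynamic described above.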
Matt Lasar has put together a very entertaining article illustrating how “Faux Celebrity FCC Filings [are] on the Rise.” What he’s referring to is the fact that just about anyone can file comments with the FCC, even fake celebrities or dead historical figures.
The whole process has become a complete joke. Some of my research on the FCC’s indecency complaint process has illustrated how one group–the Parents Television Council (PTC)–has essentially been able to stuff the complaint ballot box at the FCC by filing endless strings of computer-generated complaints from its website. The PTC then fires off letters to the FCC and Congress that essentially say, “Look! Millions of Americans are outraged by the content on TV and are clamoring for regulation!” In turn, that nonsense gets included in the congressional record when legislation is introduced, and politicians claim “the American people have spoken” and are overwhelmingly in favor of regulation.
It’s all nonsense, of course, because the vast majority of those “complaints” were just the same PTC form letter. But the same games are at work in the debates over media ownership policy and Net neutrality regulation. Jerry Brito and Jerry Ellig have shown that, in the FCC’s Net neutrality proceeding, “Close to 10,000 comments were submitted to the FCC, yet all but 143 were what the FCC calls ‘brief text comments,’ many of which were form letters generated at the behest of advocacy groups.” The same thing is at work in the media ownership debate. A couple of radical anti-media activist groups stuff the ballot box with computer-generated complaints. And the Washington Post recently ran a piece raising questions about how the public filing process is potentially being abused in the XM-Sirius merger fight.
But Matt Lasar documents how truly absurd this process has become when the likes of Paris Hilton, Donald Trump, Joseph Stalin, and even Jesus Christ end up submitting “comments” for the “public record.” Here are some of the highlights from Lasar’s writeup:
Bravo for Larry Downes of ZD Net who has a smart new column out today entitled “Save Internet Freedom–from Regulation.” Downes is referring to the ominous threat posed to the future of the Internet by the Net neutrality bill that Rep. Edward Markey (D-MA) is likely to introduce shortly. Downes points out that:
The Internet has thrived in large part because it has managed to sidestep a barrage of efforts to regulate it, including laws to ban indecent material, levy sales tax on e-commerce, require Web sites to provide “zoning” tags, and to criminalize spam, file sharing, and spyware. Some of these laws have been overturned by the courts; some died before being passed; and the rest–well, the rest are effectively ignored, thanks to the Internet’s remarkable ability (so far) to treat regulation as a network failure and reroute around the problem.
Exactly right. Why then, Downes asks next, “do the same civil-liberties groups that recognize the value of keeping the government out of Internet content want to open a loophole large enough to drive several Mack trucks through?” GREAT question, and one that we’ve been asking on this site for many years.
Lessig and Lemley also introduce an argument that seems to me to be fundamentally in tension with their broader end-to-end thesis:
One should not think of ISPs as providing a fixed and immutable set of services. Right now ISPs typically provide customer support, as well as an IP address that channels the customer’s data. Competition among ISPs focuses on access speed, as well as some competition for content. AOL, for example, is both an access provider and content provider. Mindspring, on the other hand, simply provides access. In the future, however, ISPs are potential vertical competitors to access providers who could provide competitive packages of content, or differently optimized caching servers, or different mixes of customer support, or advanced Internet services. This ISP competition would provide a constant pressure on access providers to optimize access.
I don’t agree with this, and indeed, I don’t think Lessig himself agrees with this any longer, although that may be a consequence of today’s less-competitive ISP marketplace. But it seems to me that the end-to-end principle does imply that we should “think of ISPs as providing a fixed and immutable set of services”: namely, moving bits from point A to point B without doing much else. While there’s nothing intrinsically wrong with an ISP offering other services besides that, the division of labor would seem to suggest that it’s generally going to work better for ISPs to offer basic service and third parties to provide caching, content, or “advanced Internet services.”
And indeed, that’s what has happened. Akamai, for example, was in its infancy when L&L were writing their paper. In the last seven years the company has thrived, despite the trend toward vertically integrated residential ISPs. By the same token, ISPs still provide their customers with email and web services, but it’s become far more common for users to bring their own email access and find third parties (including Flickr, Blogger, and YouTube) to host their web content. Even DNS, long considered core functionality of an ISP, is increasingly being offered by third parties.
Most dramatically, with AOL’s transition to being just another web portal, the business model of ISP-as-content-provider has completely collapsed. Hardly anyone now believes that it makes sense for your ISP to be a major provider of Internet content. ISPs should be competing on the basis of their ability to bring the cornucopia of content already on the web to you as efficiently as possible, not on their ability to provide an inevitably meager quantity of exclusive content on top of basic Internet access.
This probably wasn’t as obvious in 2000 as it is today. I haven’t seen Lessig or Lemley specifically address the point, but given Lessig’s enthusiastic embrace of network neutrality regulation (which is based on the implicit premise that ISPs shouldn’t be more than a “dumb pipe”), I would bet he’d concede that value-added ISPs aren’t as promising a concept as they appeared a decade ago.
Way back in 2000, Larry Lessig and Mark Lemley wrote The End of End-to-End: Preserving the Architecture of the Internet in the Broadband Era. It’s interesting because it underscores how rapidly the broadband debate has evolved in this decade. At the time Lemley and Lessig were writing, the big issue was whether cable companies would be required to unbundle their cable Internet service the same way phone companies were required to unbundle their DSL service. Since then, of course, the FCC has not only declined to unbundle cable lines, but has abandoned unbundling of DSL lines as well.
Lessig and Lemley were on the other side of this issue, warning that allowing cable companies to offer only integrated cable Internet service threatened to undermine the end-to-end principle of the Internet:
The consequence of this bundling will be that there will be no effective competition among ISPs serving residential broadband cable. The range of services available to broadband cable users will be determined by the “captive” ISPs owned by each local cable company. These captive ISPs will control the kind of use that customers might make of their broadband access. They will determine whether, for example, full length streaming video is permitted (it is presently not); they will determine whether customers might resell broadband services (as they presently may not); it will determine whether broadband customers might become providers of web content (as they presently may not).
The third has clearly happened, although it’s not clear to me that that’s a great loss, since third-party web hosting is an extremely competitive market. There’s no obvious reason for people to run servers out of their homes and some good reasons for them not to. The second has happened in theory, but not really in practice. I have a wireless access point connected to my cable modem despite the fact that my terms of service most likely require that I connect only a single computer. Charter hasn’t given me a hard time about it. So what your ISP requires in theory and what it enforces in practice can be very different things.
22nd-century scholars are going to find the history of AT&T around the turn of the 21st century absolutely baffling, on par with British schoolchildren having to keep track of Henry VIII’s wives. I’m reading a paper from 2001, and I did a double-take when it talked about AT&T and Time Warner as the major players in the cable industry. Then I remembered that this was the post-breakup, pre-spinoff, pre-merger AT&T–the one that was in the cable and long-distance markets. Which is basically a completely different company from the AT&T that’s now in the local telephone, DSL, and rent-seeking markets. I’m sure in another 20 years there will be a totally different company called AT&T that will be in charge of issuing me my REAL ID card and operating the terrorist surveillance cameras on every street corner.
Kevin Werbach’s “Only Connect” got quite a bit of attention in the blogosphere when it was unveiled, including a post here on TLF. The attention was well deserved. The paper does an excellent job of explaining what’s at stake in the network neutrality debate and elucidating the positions staked out by each side. His discussions of the complexities of discrimination, access tiering, quality-of-service, etc., in sections III(B) and III(C) are especially well done. He seems more keenly attuned than most scholars to the challenges that a regulator tasked with enforcing a non-discrimination rule would face.
With that said, I think the paper suffered from a fundamental conceptual weakness that left me unpersuaded by its central thesis: I wasn’t convinced that interconnection and non-discrimination are separate and distinct regulatory issues. To the contrary, I think the two are often intimately connected. An effective interconnection mandate almost always depends on ensuring that the terms of interconnection are non-discriminatory. If network owner A is forced to interconnect with network owner B against its will, there are a variety of ways A can retaliate: charging B unreasonable prices, dropping B’s packets, dragging its feet on installing B’s equipment, and so on. In practice, an interconnection mandate will invariably require some network-neutrality-like regulations to make it effective. The converse is equally true: a legal rule mandating non-discriminatory routing policies is likely to require some regulation of interconnection terms in order to ensure that the regulated carrier doesn’t discriminate through the back door by offering only low-quality links to those carriers against whom it wishes to discriminate.
I’m in the midst of a big writing project on network neutrality, and so I’m going to do a series of posts on papers I’ve been reading. Some of the material in these posts may find its way into the forthcoming paper. I’m going to start with “A Coasian Alternative to Pigovian Regulation of Network Interconnection,” a paper by two FCC economists that purports to offer an alternative to the FCC’s current inter-carrier compensation regime, whereby long-distance firms pay local exchange carriers to terminate calls to the LEC’s subscribers. I’m not specifically interested in telephone regulation, but Atkinson and Barnekov suggest their arguments apply to other networks as well, and they’re cited by others (including Kevin Werbach, whom I’ll discuss in a future post) in the network neutrality debate, so I thought it was worth reading.
It seems to have become trendy to label one’s policy prescriptions “Coasian,” and that’s how Atkinson and Barnekov frame their analysis. They argue that the FCC’s current compensation regime is “Pigovian” because a government bureaucrat dictates the prices that network owners must pay each other for the privilege of interconnection. Under Atkinson and Barnekov’s alternative, the FCC would… dictate the prices that network owners must pay each other for the privilege of interconnection. But they think they have a formula that is less arbitrary than the formula currently being used, and would therefore better approach the Coasian ideal of clearly-defined property rights.
In a nutshell, when one network owner wished to connect with another network owner, Atkinson and Barnekov would have them calculate the total cost of interconnection and then split it down the middle. This total cost would include not just the costs of interconnection at the edge of the network (say, stringing fiber between their facilities) but also the increased cost imposed inside each network, such as the additional capacity one network would need to carry the other’s traffic. Once that total was computed, it would be divided by two, and one party would pay the other so that each bore half the cost.
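The arithmetic of the proposed rule is simple enough to sketch. The function name and all the cost figures below are invented for illustration; only the split-the-total-evenly rule comes from the paper as described above:

```python
# A hypothetical sketch of the Atkinson-Barnekov cost-split rule.
# Assumes the edge cost (e.g., fiber between facilities) is initially
# shared equally, while each network pays its own internal capacity cost.

def split_interconnection_cost(edge_cost, internal_cost_a, internal_cost_b):
    """Total every cost of interconnecting, split it down the middle,
    and return the side payment A owes B (negative if B owes A)."""
    total = edge_cost + internal_cost_a + internal_cost_b
    share = total / 2
    # A's out-of-pocket spending before any transfer:
    paid_by_a = internal_cost_a + edge_cost / 2
    # The transfer evens out the burden so each side bears exactly half.
    transfer_a_to_b = share - paid_by_a
    return total, share, transfer_a_to_b

total, share, transfer = split_interconnection_cost(
    edge_cost=10.0,        # stringing fiber between the two facilities
    internal_cost_a=30.0,  # extra capacity A needs for B's traffic
    internal_cost_b=50.0,  # extra capacity B needs for A's traffic
)
print(total, share, transfer)  # 90.0 45.0 10.0
```

With these invented numbers, A has spent 35 and B has spent 55, so A transfers 10 to B and each side ends up bearing exactly 45, half the total.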
Two commentators tried to <a href="http://article.nationalreview.com/?q=MGQ4NGJjYjMwMTgzMzgwMmIxYjkyMTNkNWYxNjU2MzA=">argue</a> that FCC Chairman Kevin J. Martin has held true to conservative principles notwithstanding recent attempts to re-regulate the cable industry. Cesar V. Conda and Lawrence J. Spiwak posited that a “pro-entry/pro-consumer-welfare mandate” is the very “hallmark of economic conservatism.” This is a bizarre statement.
“Pro-entry” is a euphemism for competitor welfare, the antithesis of consumer welfare. Competitor welfare used to be the guiding principle of antitrust law – a legacy of the populist movement. The idea was that more competitors equaled stronger competition. It’s intuitively appealing, but it confuses quantity with quality and is wrong if the competitors are inefficient. Protection of inefficient competitors is a form of subsidy.
For example, the Clinton FCC tried to jumpstart competition in telecom with a “pro-entry” policy that allowed startups to lease facilities and services below cost from incumbent providers like AT&T and Verizon. You might think that’s no big deal; AT&T and Verizon can probably afford it. But the truth is they don’t absorb such losses; they pass them on to their remaining customers.