Articles by Tim Lee

Timothy B. Lee (Contributor, 2004-2009) is an adjunct scholar at the Cato Institute. He is currently a PhD student and a member of the Center for Information Technology Policy at Princeton University. He contributes regularly to a variety of online publications, including Ars Technica, Techdirt, Cato @ Liberty, and The Angry Blog. He has been a Mac bigot since 1984, a Unix, vi, and Perl bigot since 1998, and a sworn enemy of HTML-formatted email for as long as certain companies have thought that was a good idea. You can reach him by email at leex1008@umn.edu.


Google’s new blog has a post laying out their position on network neutrality. I’m probably missing something, but it strikes me as rather incoherent:

What kind of behavior is okay?

  • Prioritizing all applications of a certain general type, such as streaming video;
  • Managing their networks to, for example, block certain traffic based on IP address in order to prevent harmful denial of service (DOS) attacks, viruses or worms;
  • Employing certain upgrades, such as the use of local caching or private network backbone links;
  • Providing managed IP services and proprietary content (like IPTV); and
  • Charging consumers extra to receive higher speed or performance capacity broadband service.
On the other hand:

What isn’t okay?

  • Levying surcharges on content providers that are not their retail customers;
  • Prioritizing data packet delivery based on the ownership or affiliation (the who) of the content, or the source or destination (the what) of the content; or
  • Building a new “fast lane” online that consigns Internet content and applications to a relatively slow, bandwidth-starved portion of the broadband connection.

So if Verizon builds a 30 Mbps pipe to consumers’ homes, and allocates 25 Mbps to a proprietary IPTV service (“Providing managed IP services and proprietary content”) and 5 Mbps to public Internet traffic, is that OK? What if they then consign all video traffic (“all applications of a certain general type”) in the public Internet to the lowest priority, rendering it effectively unusable? And can they then syndicate content from third parties through their IPTV service?

If so, I don’t understand what network neutrality is supposed to accomplish. If not, how am I misreading Google’s proposal?
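Here’s a rough back-of-envelope version of that scenario. The bitrates below are my own illustrative assumptions (Google’s post doesn’t give any numbers beyond the pipe sizes), but they show why the carve-outs matter:

```python
# Back-of-envelope sketch of the hypothetical above. The video and
# background-traffic bitrates are illustrative assumptions, not figures
# from Google's post.

PIPE_MBPS = 30             # total capacity of the hypothetical last-mile pipe
IPTV_RESERVED_MBPS = 25    # reserved for the proprietary "managed" IPTV service
PUBLIC_MBPS = PIPE_MBPS - IPTV_RESERVED_MBPS  # left over for the open Internet

VIDEO_STREAM_MBPS = 4      # assumed bitrate of a watchable third-party video stream
OTHER_TRAFFIC_MBPS = 3     # assumed web/email/VoIP load sharing the public lane

# If "all applications of a certain general type" (here, video) get the lowest
# priority, video sees only whatever the higher-priority traffic leaves behind.
leftover_for_video = max(0, PUBLIC_MBPS - OTHER_TRAFFIC_MBPS)

print(f"Public Internet share: {PUBLIC_MBPS} of {PIPE_MBPS} Mbps")
print(f"Left for deprioritized video: {leftover_for_video} Mbps")
print(f"Third-party video usable? {leftover_for_video >= VIDEO_STREAM_MBPS}")
```

On those assumptions, the proprietary IPTV service works fine while third-party video delivered over the public Internet does not, even before any syndication deals enter the picture.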

    A 1909 EULA

by Tim Lee on June 19, 2007 · 2 comments

Apparently, using elaborate licensing terms to extend the rights granted under copyright and patent law is not a new idea, nor is the practice limited to the software industry. From a record manufactured before 1909:

This record which is registered on our books in accordance with the number hereon, is licensed by us for sale and use only when sold to the public at a price not less than one dollar each. No license is granted to use this record when sold at a less price. This record is leased solely for the purpose of producing sound directly from the record and for no other purpose; all other rights under the licensor’s patents under which this record is made are expressly reserved to the licensor. Any attempt at copying or counterfeiting this record will be construed as a violation of these conditions. Any sale or use of this record in violation of any of these conditions will be considered as an infringement of our United States patents, Nos. 524543, dated February 19, 1895, and 548623, dated October 29, 1895, issued to EMILE BERLINER, and No. 739,318, dated September 22, 1903, and No. 778,976, dated January 3, 1905, and of our other U.S. patents covering this record, and all parties so selling or using the record, or any copy thereof, contrary to the terms of this license, will be treated as infringers of said patents, and will render themselves liable for suit.

I don’t know enough about copyright history to be sure, but my guess is that the license leans so heavily on patent law because “mechanical reproductions” of music were not covered by copyright law until the 1909 Copyright Act. So record companies apparently attempted to use patent law plus some creative contract terms to create the contractual equivalent of copyright.

    I have the bad feeling that I’m going to find myself disagreeing with Larry Lessig a lot more in the next few years.

    Lessig did a post today announcing that he’s going to be re-orienting his research away from copyright issues:

    From a public policy perspective, the question of extending existing copyright terms is, as Milton Friedman put it, a “no brainer.” As the Gowers Commission concluded in Britain, a government should never extend an existing copyright term. No public regarding justification could justify the extraordinary deadweight loss that such extensions impose.

    Yet governments continue to push ahead with this idiot idea — both Britain and Japan for example are considering extending existing terms. Why?

    The answer is a kind of corruption of the political process. Or better, a “corruption” of the political process. I don’t mean corruption in the simple sense of bribery. I mean “corruption” in the sense that the system is so queered by the influence of money that it can’t even get an issue as simple and clear as term extension right. Politicians are starved for the resources concentrated interests can provide. In the US, listening to money is the only way to secure reelection. And so an economy of influence bends public policy away from sense, always to dollars.

Now, I wholeheartedly agree with his assessment that lobbyists often corrupt the political process. And certainly copyright law—an issue on which I share almost all of his views—is a prime example of that. He’s quite right that there’s no plausible policy argument for retroactive copyright extension, yet Congress did it anyway because of the copyright industry’s lobbying might.

    Continue reading →

Tom Lee suggests that I’m overstating my case with regard to the innovativeness of free software:

    But I don’t think this lack of originality is due to any inherent flaw in open-source contributors or the organizational model they employ. I think it’s simply a question of capital — open source projects typically haven’t got any. The vast majority of applications benefit from network effects that arise when their userbase becomes large enough: suddenly it’s easier to find someone to play against online, or the documentation is better, or you can exchange files in the same format that your friend uses. It’s relatively easy for open-source projects to achieve the necessary level of market interest when dealing with highly technical users and applications, as Tim’s examples demonstrate — there are accepted techniques (e.g. the RFC process, making frequent commits to the project) and media outlets (e.g. listservs, usenet) that can confer legitimacy and generate interest without an investment.

    Continue reading →

Here’s the other specific criticism you’ll find in Carr’s critique of peer production:

    But for all its breadth and popularity, Wikipedia is a deeply flawed product. Individual articles are often poorly written and badly organized, and the encyclopedia as a whole is unbalanced, skewed toward popular culture and fads. It’s hardly elitist to point out that something’s wrong with an encyclopedia when its entry on the Flintstones is twice as long as its entry on Homer.

Carr doesn’t even have the basic facts right here. To start with, the Flintstones entry, at some 5672 words, is actually only about 50 percent longer than the Homer entry, with around 3822 words. But more to the point, the entry on Homer includes links to entries on the Homeric Question (1577 words), Ancient accounts of Homer (1183 words), Homeric scholarship (4799 words), Homeric Greek (582 words), and The Historicity of the Iliad (1720 words). If my math is right, that’s 13,683 words, more than double the number of words in the Flintstones article. (The Flintstones article doesn’t appear to be divided into sub-sections the way the Homer article is, although there are entries on Flintstones-related topics, such as the characters in the show and the actors who played them. But on the other hand, there are also lengthy entries on The Iliad, The Odyssey, The geography of the Odyssey, and The Trojan War.)
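For anyone who wants to check that arithmetic, here’s the tally, using the word counts quoted above:

```python
# Tallying the word counts cited above: the Homer entry plus its linked
# sub-articles, versus the Flintstones entry.

homer_related = {
    "Homer": 3822,
    "Homeric Question": 1577,
    "Ancient accounts of Homer": 1183,
    "Homeric scholarship": 4799,
    "Homeric Greek": 582,
    "Historicity of the Iliad": 1720,
}
flintstones_words = 5672

homer_total = sum(homer_related.values())
print(f"Homer and sub-articles: {homer_total} words")         # 13683
print(f"The Flintstones:        {flintstones_words} words")
print(f"Ratio: {homer_total / flintstones_words:.2f}")         # roughly 2.4
```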

    Continue reading →

    Nick Carr has a lengthy article on the pros and cons of peer production. To some extent, it’s hard to quibble with his basic point that peer production is useful but not a panacea. Clearly, peer production doesn’t work for every project, and it will always rely on a core group of dedicated individuals who do a lot of the work.

But what I found striking about the article is that it spends a lot of time asserting that peer-produced products have problems while providing hardly any examples. Below the fold, I’ll look at one of the few specific criticisms of free software, which I find to be seriously misguided.

    Continue reading →

A reader sent me a link to last week’s controversy over the blocking of Listpic by Craigslist. Listpic was a service that allowed users to view Craigslist ads as image thumbnails rather than as text ads:

    In an e-mail to Wired News, Craigslist CEO Jim Buckmaster explains the company’s position: “The 0.1% of our users who were accessing Craigslist images via Listpic were creating a grossly disproportionate drain on our server resources, degrading performance significantly for the 99.9% of our users accessing Craigslist in the normal fashion. Besides frequently hitting our site to harvest Craigslist user content for re-display on their site, each Listpic page load was causing our systems to serve up approx 100 full size images.”

In a forum post on Craigslist, company founder Craig Newmark (look for his yellow name tag) echoed Buckmaster’s comments. And of course there’s the inconvenient little fact that Listpic was serving Craigslist data on an external website while selling advertising — a big no-no.

The reader who emailed it to me suggested that this illustrates Craig Newmark’s hypocrisy: he advocates network neutrality for telcos while his company blocks websites that use Craigslist content in ways it doesn’t approve of. I don’t think that’s quite right. I certainly think Craig’s support of Internet regulations is misguided, but I don’t think this episode really illustrates it. In the first place, network neutrality is a principle for ISPs to follow. No one has ever seriously suggested that it be applied to websites, and indeed, it’s not clear what a neutrality policy for websites would even mean. Second, Listpic was not a Craigslist customer, so there’s no reason to think Craigslist has any obligations to Listpic at all. Third, Craig appears to be claiming that Listpic wasn’t even serving up the images itself, but was hot-linking them from Craigslist’s servers; hot-linking on such a large scale is universally regarded as a no-no.

There are good arguments against government regulation of the Internet, but spurious accusations of hypocrisy against Craigslist, Google, or anybody else are not among them.

    Hundreds of Billions?

by Tim Lee on June 17, 2007 · 2 comments

    Julian pinpoints exactly what’s wrong with the ludicrous claim that copyright infringement is a bigger problem than ordinary property crime:

There’s a big difference between the cost of theft to an industry or firm and the cost to the country. If I steal your bike, I’ve cost you one bike, not America. America still has the same number of bikes in it. The cost to the country of the theft of physical resources is not the cost of the resources themselves, typically, but rather the efficiency loss of shifting those resources to less valued uses, the cost of resources expended preventing or prosecuting it, the opportunity cost of the effort expended on a zero-sum transfer, and so on. By contrast, piracy is actually positive sum in static terms: Nobody has any fewer programs, songs, or movies, while the pirate has (at least) one more. Nothing has been redirected to a lower-valued use. So the only actual loss in this case is the value of new IP that doesn’t get created because piracy prevents prospective creators from fully internalizing its value. (There may be further losses if we think piracy induces companies to raise the prices of their products, and there are consumers who are priced out of the legal market by this, but don’t avail themselves of pirate copies.) I don’t know how significant that number is—or even how you’d measure it accurately—but “hundreds of billions” just doesn’t pass the straight face test.

    A software industry study released a few years back tried to translate piracy losses into job losses in the tech sector. And again, even if we take that number at face value, just looking at one specific sector is worse than useless. By that mode of reckoning, Bastiat’s satirical “Petition from the Manufacturers of Candles” should be read as a serious guide to policy.

Quite so. As I’ve argued before, attempts to inflate the costs of movie piracy fall equally flat. Even if we equate the lost revenue of the movie industry with the lost wealth to the country (which is almost certainly an overstatement), the losses to the movie industry are measured in the billions (about $6.1 billion, to be precise), not tens or hundreds of billions. And the software industry’s inflated figure is only $34 billion. “Hundreds of billions” isn’t even remotely plausible.
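Just to put the arithmetic in one place, taking the industry figures cited above at face value (which I don’t):

```python
# Sanity-checking the "hundreds of billions" claim against the industry
# figures cited above, taken at face value.

movie_losses_bn = 6.1      # movie industry's estimate of annual piracy losses
software_losses_bn = 34.0  # software industry's (inflated) estimate

combined_bn = movie_losses_bn + software_losses_bn
print(f"Combined industry estimates: ${combined_bn:.1f} billion")  # about $40 billion

# "Hundreds of billions" means at least $200 billion.
print(f"Gap to the low end of 'hundreds of billions': {200 / combined_bn:.1f}x")
```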

    Mormon Women for Sale

by Tim Lee on June 17, 2007 · 0 comments

    I haven’t been able to think of a policy angle to the spat (via Matthew Ingram) between Google and eBay, but this is just too funny to pass up.

[Image: mormon_women.png]

    From Derek Slater:

    A judge ordered [PDF] the FBI today to finally release agency records about its abuse of National Security Letters (NSLs) to collect Americans’ personal information. The ruling came just a day after the EFF urged [PDF] the judge to immediately respond in its lawsuit over agency delays.

    EFF sued the FBI in April for failing to respond to a Freedom of Information Act (FOIA) request about the misuse of NSLs as revealed in a Justice Department report. As we noted yesterday, more evidence of abuse was uncovered by the Washington Post, and EFF urged the judge Thursday to force the FBI to stop stalling the release of its records on the deeply flawed program.

    I bet we’ll learn all sorts of interesting things from these records!