Articles by Tim Lee

Timothy B. Lee (Contributor, 2004-2009) is an adjunct scholar at the Cato Institute. He is currently a PhD student and a member of the Center for Information Technology Policy at Princeton University. He contributes regularly to a variety of online publications, including Ars Technica, Techdirt, Cato @ Liberty, and The Angry Blog. He has been a Mac bigot since 1984, a Unix, vi, and Perl bigot since 1998, and a sworn enemy of HTML-formatted email for as long as certain companies have thought that was a good idea. You can reach him by email at leex1008@umn.edu.


On Knowing Your Subject

by Tim Lee on January 25, 2007 · 2 comments

“Cog” at The Abstract Factory has a damning critique of the Hahn/Litan paper’s citation of RFCs. He considers each of the four RFCs in turn and makes a pretty compelling argument that none of them says what Hahn and Litan claim it says.

Before I get to Cog’s post, I should note that I was wrong in saying they didn’t cite the RFCs in question. There are, apparently, a short version and a long version of the paper, and I was looking at the shorter version, which omits the footnotes. They did cite the RFCs in the full version.

Anyway, I’ll just quote Cog’s comments on one of the four RFCs Hahn and Litan quote:

Continue reading →

Bad News for Apple

by Tim Lee on January 24, 2007 · 14 comments

From the Financial Times:

Apple was dealt a blow in Europe on Wednesday when Norway’s powerful consumer ombudsman ruled that its iTunes online music store was illegal because it did not allow downloaded songs to be played on rival technology companies’ devices.

The decision is the first time any jurisdiction has concluded iTunes breaks its consumer protection laws and could prompt other European countries to review the situation.

The ombudsman has set a deadline of October 1 for Apple to make its codes available to other technology companies so that it abides by Norwegian law. If it fails to do so, it will be taken to court, fined and eventually closed down.

Apple, whose iTunes dominates the legal download market, has its proprietary system, FairPlay. Songs and tunes downloaded through iTunes are designed to work with Apple’s MP3 player, the iPod, but cannot be played on rival devices.

Although I applaud the goal of increasing competition in the legal download market, I don’t think having Norwegian bureaucrats oversee Apple’s software development process is a good solution. But as I’ve pointed out before, this shouldn’t be a surprise. Regulatory schemes like the DMCA (and the EUCD in Europe) frequently have unintended consequences. And those unintended consequences are often cited as a justification for enacting more regulations to mitigate the harms caused by the previous round of regulation.

What we need, instead, is deregulation: Congress should repeal the anti-circumvention provisions of the DMCA, so that companies are free to reverse-engineer Apple’s products in order to build compatible devices.

More on the Hahn/Litan Paper

by Tim Lee on January 24, 2007

A couple of other quick points about the Hahn/Litan paper:

  • Throughout the paper, the authors fail to distinguish between neutrality as a means and neutrality as an end. The standard argument for regulation isn’t that all Internet services must operate at precisely the same speed. It’s that certain means of advantaging some traffic over others–namely, network providers setting up routing policies that prioritize incoming traffic based on who has paid extra for the privilege–will be damaging to the Internet as a whole. You can agree or disagree with that premise, but I don’t think it’s that hard a point to understand. And it obviously doesn’t implicate services like Akamai, which aren’t network providers at all, and which achieve “non-neutral” ends through scrupulously neutral means.
  • The paper’s citation of Ed Felten is a little bit odd. They describe him as a “proponent of the end-to-end principle,” which he is, but they fail to mention that he ultimately comes down against new regulations. I think that’s unfortunate, because I think Felten’s line of argument–that discrimination is a complicated concept, and writing a good neutrality rule will be a lot more difficult than people expect–is pretty compelling. Indeed, it’s precisely the sort of argument that should be old hat to two old hands at analyzing regulatory issues, an arena where the law of unintended consequences is constantly raising its ugly head. So it’s a little strange that the authors would implicitly lump Felten in with Wu and Lessig as a proponent of new regulation, rather than citing him as one of the most articulate skeptics of new regulation.

    I should emphasize that I agree with Hahn and Litan’s policy conclusion. And I certainly think it’s possible that priority-based pricing will be beneficial, and that’s a reasonable argument against premature regulatory intervention. But it doesn’t strike me as very likely, and I think the debate would be enhanced if those who did think it was likely (on both sides of the debate) paid a little bit more attention to the details of how it would actually work. I think that if they did so, supporters of regulation would find that it wasn’t as big a threat as they’d imagined, and critics would find that discrimination won’t solve as many problems as they hope it will.

The AEI-Brookings Joint Center has a new paper on network neutrality regulation.

The economic logic of the paper is impeccable; price discrimination often benefits consumers because it allows service providers to offer premium services to those willing to pay, while giving other consumers the option of bare-bones service at cut-rate prices. The example they use is airline seats: both first class and coach passengers benefit from the airlines’ price discrimination–those who care about comfort get a nicer ride, while those who care more about price get a cheaper ticket.

But like most people commenting on this issue–on both sides of the debate–Hahn and Litan are frustratingly vague about how exactly the price discrimination regime would work. For example:

Continue reading →

I’m reading the briefs leading up to the Ninth Circuit’s Kahle decision (which was handed down this week), and I found this passage, from the government’s motion to dismiss at the district court level, striking:

Under the 1909 Act, a copyright holder could secure a 28-year renewal term only after filing a renewal registration with the Register of Copyrights in the last year of the first 28-year term of protection. S. Rep. No. 102-194, at 3 (1992). “In 1976, Congress concluded years of debate and study on all aspects of the Copyright Act by passing a comprehensive revision to the 1909 law.” Id. Congress identified the copyright renewal provision as “[o]ne of the worst features of the present copyright law.” H.R. Rep. No. 94-1476, at 134. “A substantial burden and expense, this unclear and highly technical requirement results in incalculable amounts of unproductive work. In a number of cases it is the cause of inadvertent and unjust loss of copyright.”

So Congress found in 1976 that requiring authors to file for the renewal of their own works was an unjustified administrative nightmare. This, the government argues, justified scrapping the renewal requirement. This despite the fact that the burden and expense is spread across thousands of different authors, and despite the fact that authors know better than anyone else which works they own and which works are still commercially viable.

Continue reading →

Joe Consumer, Beta Tester

by Tim Lee on January 23, 2007 · 4 comments

Ken Fisher at Ars has a great article on the flaws in HDCP, the copy protection scheme that “secures” most high-definition devices these days:

This stuff doesn’t work reliably for even the basic stuff like showing video flawlessly, let alone securing outputs. I even have an HDCP/HDMI issue with my TiVo, which decides that my TV is no longer secure about once a month, requiring a reboot.

Stranger reports have arisen from PlayStation 3 owners who are experiencing blinking displays when connected to some HDTV sets. When playing games, occasionally the sound would cut out and the entire display would blink on and off. As it turns out, the HDCP technology in the PS3 would freak out and sputter if a connected TV could not consistently and quickly indicate it was copy-protection ready. No one knew that this was the case until the guys at Popular Mechanics pinned the tail on the donkey.

Continue reading →

In Defense of Brain Drain

by Tim Lee on January 22, 2007 · 14 comments

Related to our discussion a couple of weeks ago about immigration for high-tech workers, Katherine Mangu-Ward cites a study illustrating one way that a “brain drain” can be good for the country from which the brains drain:

Imagine, if you will, foreign movie makers who come to California. They are much more likely to make excellent movies there–or even to make movies at all, really–and more of their countrymen will get to watch them when they appear, especially if their countrymen have few qualms about bootlegs.

The authors, economists Peter J. Kuhn and Carol McAusland, write that those who remain behind “benefit because ‘their’ brains produce ‘better’ knowledge (such as more effective medicines, more entertaining movies, or more effective software) abroad than if they had remained at home.” This is particularly true in situations where a discrepancy between protections for intellectual property at home and abroad makes it easy for residents of the innovators’ countries of origin to enjoy the fruits of their labors with low transaction costs.

Personally, I find the notion that someone should be forced to live in an impoverished country solely so that his countrymen can benefit from his presence morally repugnant. But even if you buy that premise, it’s not at all clear that liberal immigration of high-skilled workers is, on net, harmful to poor countries. And liberal immigration is undeniably beneficial to the world as a whole.

Good Riddance to Print

by Tim Lee on January 22, 2007

Ezra Klein shrugs at the decline of the traditional American daily:

Newspapers currently expend a fair number of resources doing certain things very poorly, or replicating the efforts of other organizations. That was fine when the information junkie had few alternatives. It’s less so when the world offers limitless avenues for data accumulation.

But all this really means is that newspapers will begin following magazines and specialty newspapers (like The Wall Street Journal) and seeking to make themselves indispensable to certain audiences. Some of those audiences may be ideological, and you’ll see campaigning newspapers akin to the British Guardian or Fox News. Some will be professional, and you’ll see dedicated foreign bureaus that do nothing save in-depth reporting on global issues, in much the way National Journal does for Congress. All will be, in their way, more relevant. The bloodless, fearful paradigm of “objective” reporting has alienated all while informing none, and it will likely come to a close.

Continue reading →

IT&T News is a great publication that features many excellent articles by a variety of free-market policy experts. But I found this article on e-voting, by PRI’s Vince Vasquez, rather disappointing:

The e-voting experience has been a resounding success that has generated relatively few complaints from the electorate. To be sure, there were some legitimate problems with DRE machines on November 7, but many have been found to be man-made, such as innocent user error, inept poll workers, or ineffective planning by local election authorities. Unfortunately, these human-based fumbles have opened the doors for open-source zealots, wide-eyed activists, and crafty politicians who want to scrap DREs for the 2008 elections.

I’m not a politician, and I actually don’t think that open source would solve what’s wrong with e-voting, so by process of elimination, I must be a “wide-eyed activist.” I bet Ed Felten and Avi Rubin–both widely respected computer scientists–would be surprised to learn that they, too, are “wide-eyed activists.”

After busting out that sort of inflammatory rhetoric, you would think that Mr. Vasquez would have some pretty compelling refutations of us wide-eyed activists. But he doesn’t even mention–much less address–any of the actual arguments that e-voting critics make against computerized voting. No mention of the fact that DREs are less transparent, harder to audit, and more susceptible to wide-scale (rather than local) fraud than paper ballots. No mention of the current debacle in Florida, the various reports of problems with e-voting machines, or the fact that computer security researchers have actually demonstrated that some e-voting machines are vulnerable to vote-stealing viruses.

Nope, all we get are vague arguments about how “digital red tape and risky industry requirements jeopardizes the value of these innovative machines.” (Why are they innovative? Because there are computers in them!) And overheated rhetoric about “feeding the country’s voting system to ideological lions.” There might be some good arguments for using DREs, but Mr. Vasquez doesn’t seem to have any.

Welcome Brooke Oberwetter

by Tim Lee on January 22, 2007 · 20 comments

I’m excited to announce that Brooke Oberwetter is joining the TLF team. Brooke has been a friend of mine since we worked together at Cato. She’s one of the sharpest and funniest people I know. Brooke earned my admiration for her tireless (and, sadly, futile) fight to stop the smoking ban in DC. Also, with the possible exception of Julian, she throws the best parties in DC.

And (despite my occasional nitpicking) she has many interesting and worthwhile things to say about tech policy. She’s a policy analyst at the Competitive Enterprise Institute, and she tells me her work at CEI will be more focused on tech policy in the coming months. She’s currently pursuing a master’s degree in public policy at American University, and she also blogs at the CEI blog and her personal blog.