Articles by Tim Lee

Timothy B. Lee (Contributor, 2004-2009) is an adjunct scholar at the Cato Institute. He is currently a PhD student and a member of the Center for Information Technology Policy at Princeton University. He contributes regularly to a variety of online publications, including Ars Technica, Techdirt, Cato @ Liberty, and The Angry Blog. He has been a Mac bigot since 1984, a Unix, vi, and Perl bigot since 1998, and a sworn enemy of HTML-formatted email for as long as certain companies have thought that was a good idea. You can reach him by email at leex1008@umn.edu.


“Stealing” Wi-Fi

by Tim Lee on November 23, 2004

A recent Slate article goes over the ins and outs of “stealing” your neighbors’ Internet connection via wi-fi:

Every techie I know says that you shouldn’t use other people’s networks without permission. Every techie I know does it anyway. If you’re going to steal–no, let’s say borrow–your neighbor’s Wi-Fi access, you might as well do it right. Step one: Lose the guilt. The FCC told me that they don’t know of any federal or state laws that make it illegal to log on to an open network. Using someone’s connection to check your e-mail isn’t like hacking into their bank account. It’s more like you’re borrowing a cup of sugar.

This techie doesn’t say that you shouldn’t use other people’s networks without permission. In fact, I deliberately leave my wi-fi network unprotected, in case my neighbors have problems with their service and need a backup. They also have wi-fi, and do the same. I have no idea if they’re doing it on purpose or don’t know any better, but in either event my mooching doesn’t seem to have bothered them.

Technically speaking, you probably are violating your ISP’s terms of service by “sharing” your connection. But those provisions are vague enough (and, if interpreted literally, silly enough) that I don’t have any real qualms about ignoring them. It doesn’t cost the ISP appreciably more to occasionally carry traffic from my neighbors, and the benefit of having a backup Internet connection in a pinch is substantial.

My geeky ex-co-workers would kill me for saying this, but I really don’t think it’s a big deal from a security standpoint. Yes, if you happen to have a determined hacker next door, opening your wireless network makes his job easier. But the fact is that there are only a few thousand determined hackers in the country. My chances of landing one of them as a neighbor are remote. And besides, being my neighbor would make him pretty easy to catch if he did something illegal with my connection.

Assuming you don’t have a determined hacker next door, locking your computer down isn’t that difficult. Turn off services you don’t need, like file sharing. If you must run a local network service, make sure you pick a decent password for it. And never send personal information like credit card numbers via email or over other non-encrypted channels. Really, you should be taking all those steps whether you leave your wi-fi network open or not– the unencrypted Internet is inherently insecure, and you should assume any open service could be hacked and any data sent in the clear could be snooped.
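If you want to check which services your own machine is exposing, you can simply probe your own ports. Here’s a minimal sketch of my own in Python (not taken from any particular security tool); it opens a throwaway listener just so there’s a known-open port to find:

```python
import socket

def port_is_open(host, port, timeout=0.5):
    """Return True if a TCP connection to (host, port) succeeds."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(timeout)
        return s.connect_ex((host, port)) == 0

# Start a throwaway listener so there's a known-open port to find.
listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
listener.bind(("127.0.0.1", 0))    # port 0: let the OS pick a free port
listener.listen(1)
open_port = listener.getsockname()[1]

print(port_is_open("127.0.0.1", open_port))   # something is listening
listener.close()
print(port_is_open("127.0.0.1", open_port))   # nothing listening anymore
```

Run a loop over the ports you care about (or just use a real scanner like nmap) and shut off anything you don’t recognize.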

So I say: share and share alike. Let your neighbors use your wi-fi service, and go ahead and use theirs. Information, after all, wants to be free.

HT: J. Lo.

Goodbye Gigahertz

by Tim Lee on October 15, 2004

The GHz race officially came to an end this week. No, really. Intel, which has held the speed crown for more than five years, has thrown in the towel, announcing that it would break the 4 GHz barrier… well, never.

This is a development that analysts have been predicting for years. Since the late ’90s, CPUs have been much faster than the memory and buses that feed them with data. That means additional processor speed is mostly useless for the vast majority of data-intensive tasks. Worse, Intel cheated in designing the Pentium 4, ramping up the clock rate mostly by reducing how much the chip did on each cycle. The result was a chip that had a higher clock rate but didn’t actually perform any better than slower-clocked chips that did more with each cycle.

The design of the Pentium 4 was driven by marketing, not engineering, considerations: GHz was an easily understood metric for judging processor speed, and so having the fastest chip was an effective selling point. But the reality has become so obvious that even marketing people can’t ignore it. Had Intel continued down this path, it would have needed ever-more-elaborate cooling technologies to keep its chips from melting.

From now on, expect chipmakers to focus on greater parallelism– putting more than one processor on a chip, executing multiple instructions per cycle– and on non-performance features like reducing power consumption. Both IBM’s G5 (which is in Apple’s new Power Macs) and AMD’s x86-64 architecture do a better job of getting more performance out of fewer clock cycles. The shift to non-performance-related features is already apparent with Intel’s Centrino line, which is targeted at mobile devices and boasts lower power consumption and wireless features.
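Parallelism only helps software that’s written to exploit it. As a rough illustration (my own sketch, nothing to do with any particular chip), here’s how a program divides one job into chunks that can run side by side:

```python
from concurrent.futures import ThreadPoolExecutor

def partial_sum(bounds):
    """Sum one chunk of the range."""
    lo, hi = bounds
    return sum(range(lo, hi))

def parallel_sum(n, workers=4):
    """Sum 0..n-1 by splitting the range into chunks and summing each in parallel."""
    step, chunks = n // workers, []
    for i in range(workers):
        hi = n if i == workers - 1 else (i + 1) * step
        chunks.append((i * step, hi))
    # Swap in ProcessPoolExecutor to spread pure-Python work across real cores;
    # threads in CPython share one interpreter lock.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum, chunks))

print(parallel_sum(1_000_000) == sum(range(1_000_000)))  # True
```

The point is that the speedup comes from restructuring the work, not from a faster clock– which is why the shift to parallel hardware puts the burden back on programmers.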

Against DRM

by Tim Lee on August 26, 2004

An important component of Apple’s iTunes Music Store, and of competitors from Microsoft, Real, and Sony, is “digital rights management.” Under DRM schemes, music or other content purchased online is encrypted so that only authorized devices or programs can read it, and tagged with rules indicating who the rightful owner is and what may be done with it. If it works as advertised, such a scheme gives copyright holders complete control over how their content is used, even after that content is sold to consumers. In the case of iTMS, Apple limits how many computers are allowed to have a copy of each song, how many CDs with a given playlist can be burned, and which devices I’m allowed to offload my songs to (at present, only iPods).
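To make the encrypt-and-tag idea concrete, here’s a toy sketch of my own– emphatically not Apple’s FairPlay or any real scheme, and the “encryption” is a throwaway XOR construction, not real cryptography. The essential structure is that the content is scrambled and the rules are enforced by the player, not the file:

```python
import hashlib, json

def keystream(key: bytes, n: int) -> bytes:
    """Derive n pseudo-random bytes from a key (toy construction, not real crypto)."""
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def protect(content: bytes, key: bytes, rules: dict) -> dict:
    """'DRM-wrap' content: scramble it and attach a usage-rules tag."""
    ct = bytes(a ^ b for a, b in zip(content, keystream(key, len(content))))
    return {"ciphertext": ct, "rules": json.dumps(rules)}

def play(package: dict, key: bytes, device_id: str) -> bytes:
    """An 'authorized player' checks the rules before unscrambling."""
    rules = json.loads(package["rules"])
    if device_id not in rules["authorized_devices"]:
        raise PermissionError("device not authorized")
    ct = package["ciphertext"]
    return bytes(a ^ b for a, b in zip(ct, keystream(key, len(ct))))

song = b"la la la"
pkg = protect(song, b"secret", {"authorized_devices": ["ipod-1"]})
print(play(pkg, b"secret", "ipod-1"))  # b'la la la'
```

Notice where the trust lives: any player that holds the key and chooses to ignore the rules can decrypt everything, which is exactly why DRM schemes depend on controlling the player software and hardware.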

Many analysts on both sides of the intellectual property debate blithely assume that DRM works, both from a technological and a business perspective. They assume that DRM can prevent unauthorized copying of protected works, and (more crucially, in my view) that doing so makes business sense. I’m going to argue that both of those propositions are wrong.


The Internet Gets Smaller

by Tim Lee on August 17, 2004

Yesterday Braden linked to this Wired News story:

Effective with this sentence, Wired News will no longer capitalize the “I” in internet. At the same time, Web becomes web and Net becomes net. Why? The simple answer is because there is no earthly reason to capitalize any of these words. Actually, there never was. True believers are fond of capitalizing words, whether they be marketers or political junkies or, in this case, techies. If It’s Capitalized, It Must Be Important. In German, where all nouns are capitalized, it makes sense. It makes no sense in English. So until we become Die Wired Nachrichten, we’ll just follow customary English-language usage. (Web will continue to be capitalized when part of the more official entity, World Wide Web.)

This is confused. First off, “Net” is just short for Internet, and probably shouldn’t be used in formal writing at all. When it is used, it should be preceded by an apostrophe.

“The Internet” is the name of a specific computer network, and it should be capitalized because it’s a proper noun– the same way “Sun” and “Moon” are proper nouns. (Interestingly, my Chicago style manual says that “Earth” is capitalized only when it’s not preceded by “the.”)

The web is arguably not a proper noun– it’s a conceptual grouping of content served over HTTP. It’s not as obviously the name of a single, distinct thing as “the Internet” is. So I’m open-minded about whether the web should be capitalized. However, I completely fail to see the logic of capitalizing “World Wide Web” but not “web.” If the one is a proper noun, so is the other.

We don’t capitalize “Internet” to make a statement about how important it is, but because in our language proper nouns get capitalized. Since Wired doesn’t spell out what “customary English-language usage” requires, it’s hard to know what its argument is. Wired seems to be trying to make some kind of political statement here, but to me it falls flat.

There has been a string of stories about hackers cracking the copy-protection features of Apple’s proprietary suite of music hardware and software. The most momentous was the news that Real had figured out how to place its own copy-protected songs on iPods, without any cooperation from Apple.

I think Apple’s response was incredibly short-sighted. Steve Jobs, Apple’s CEO, appears to be of the attitude that he can single-handedly conquer the digital music market. Aside from one-sided rebranding agreements, Apple has refused to let anyone under the iTunes tent.

Apple, clearly, has not learned from its own history. This has clear parallels to the biggest platform battle of Apple’s corporate lifetime– the battle with Microsoft for dominance of desktop computing. There, as here, Apple pursued a strategy of trying to build everything itself. Microsoft, in contrast, licensed its technology freely to all comers. In the process, Microsoft built a thriving and competitive ecosystem of PC hardware manufacturers, each of which had to pay Microsoft tribute in order to run Windows. Apple, meanwhile, spent most of the last two decades trying to invent everything in-house, and Steve Jobs strangled Apple’s one tentative attempt at platform openness in its cradle when he returned to Apple’s helm in 1997. As a result, Macs today have a dismal 3% market share and have been relegated to being the niche favorite of creative professionals and yuppies.

Apple looks determined to do the same thing with its current commanding lead in the music market. Microsoft and Sony are veterans of bruising platform battles, and they’re coming with war chests of billions of dollars to take on Apple’s cozy music monopoly. Apple needs all the allies it can get in that battle. It should be locking in favorable terms now with anyone willing to take its side, not snubbing potential allies at every opportunity.

Steve Jobs has never shown himself to be a great strategic thinker. He’s a smart guy who thinks it’s cool to run a computer company and a movie studio. But he lacks Bill Gates’ appetite for world domination. And paradoxically, that makes him more–not less–of a control freak. The problem is that peaceful co-existence is rarely an option in the technology business. Either your platform comes out on top, or someone else’s does and you get relegated to obscurity.

Fortunately, Real appears to be doing to Apple what Microsoft did to IBM in the ’80s– prying its platform open against its will. If Real wins the coming legal battles, the iPod and iTunes will be open whether Apple likes it or not. That just might be a blessing in disguise.