September 2006

Skype as a Bandwidth Hog?

September 25, 2006

Ars has an interesting story about three California colleges that have decided to ban Skype from their campuses. The school administrators have what strikes me as a puzzling attitude toward the service, describing it as a “potentially illegal waste of resources,” without explaining what might be illegal about it. Perhaps they’ve somehow gotten the erroneous impression that there’s something inherently illicit about “grid-computing-like” network applications.

Aside from legal concerns, the other issue seems to be bandwidth:

according to the Office of Information Technology, the chief problem comes when a Skype client acts as a “supernode” and makes itself available to relay calls made by other users. Having numerous supernodes on a school network increases bandwidth consumption and has a detrimental impact on connectivity, according to the memo. Anecdotal reports from individual Skype users reveal that bandwidth consumption can increase by as much as an entire gigabyte per month for a single Skype client when it acts as a supernode.

If my math is right, 1 gigabyte per month is roughly 3 kilobits per second, a trivial amount of bandwidth on a modern campus network. Even if the bandwidth is concentrated in shorter bursts–say, if the whole gigabyte is transmitted in a single hour–that’s still a rate of only 2.2 megabits per second–roughly the bandwidth of a typical DSL line. This is not a particularly abusive use of the network.
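
For anyone who wants to check the arithmetic, here’s a quick back-of-the-envelope sketch in Python, using the 1 gigabyte/month figure from the memo quoted above (decimal units assumed):

```python
# Sanity-check the supernode bandwidth figures quoted in the memo.
GIGABYTE_BITS = 1e9 * 8           # 1 GB expressed in bits (decimal units)
SECONDS_PER_MONTH = 30 * 24 * 3600

# Spread evenly across a month, 1 GB is a trickle.
sustained_kbps = GIGABYTE_BITS / SECONDS_PER_MONTH / 1e3
print(f"1 GB/month sustained: {sustained_kbps:.1f} kbps")   # ~3.1 kbps

# Worst case: the entire gigabyte squeezed into a single hour.
burst_mbps = GIGABYTE_BITS / 3600 / 1e6
print(f"1 GB in one hour: {burst_mbps:.1f} Mbps")           # ~2.2 Mbps
```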

Rich Gordon emailed to point out this multimedia report about government surveillance. Of particular interest is this interactive feature on the government’s many existing surveillance programs. There are dozens and dozens of them, touching virtually every aspect of our lives.

Most of them seem pretty innocuous individually. For example, government surveillance of large currency transactions probably strikes most people as harmless. But as you go down the list, it becomes obvious that the whole has the potential to be a lot more than the sum of its parts. If the government tracks you every time you visit your bank, every time you get on an airline, every time you apply for financial aid, every time you apply for a driver’s license, every time you apply for a credit card, and on and on, pretty soon the government has a bit of data about almost every facet of your life.

Moreover, those are just the programs the government admits to. The press has uncovered two secret programs that engage in surveillance via the telephone network. And there are doubtless others that have not yet been uncovered.

The sheer complexity of these widely varied programs makes them especially difficult to deal with through grassroots action. If there were a single Big Brother program, the ACLU or EFF might be able to organize a grassroots backlash against it. But developing backlashes against Big Uncle, Big Cousin, Big Sister, and dozens of other piecemeal intrusions on our privacy is much more difficult. You kill one head of the hydra, and three more sprout up in its place.

By sheer coincidence, I’m currently (re-)reading Hayek’s The Constitution of Liberty, which I recommended to Luis in a recent post. I thought this passage was interesting:

The importance of our being free to do a particular thing has nothing to do with the question of whether we or the majority are ever likely to make use of that particular possibility. To grant no more freedom than all can exercise would be to misconceive its function completely. The freedom that will be used by only one man in a million may be more important to society and more beneficial to the majority than any freedom that we all use.

It might even be said that the less likely the opportunity to make use of freedom to do a particular thing, the more precious it will be for society as a whole. The less likely the opportunity, the more serious will it be to miss it when it arises, for the experience that it offers will be nearly unique. It is also probably true that the majority are not directly interested in most of the important things that any one person should be free to do. It is because we do not know how individuals will use their freedom that it is so important. If it were otherwise, the results of freedom could also be achieved by the majority’s deciding what should be done by the individuals. But the majority action is, of necessity, confined to the already tried and ascertained, to issues on which agreement has already been reached in that process of discussion that must be preceded by different experiences and actions on the part of different individuals.


The Long Tail of Politics

September 24, 2006

Via Mike Linksvayer, I see that Nick Gillespie has a new interview with Chris Anderson. Anderson “laments that national politics has yet to become part of the Long Tail,” to which Mike responds:

The real long tail of politics isn’t about elections at all. Even if I can vote for my ideal candidate, or vote directly on every issue, at the end of the day I will still get policies approximating those of George W. Bush and John Kerry. That’s like being able to order any of millions of books at Amazon but always getting the current #1 best seller delivered regardless of your order.

The real long tail of politics is decentralization and arbitrage. Lots of people say “Bush isn’t my president.” Why can’t that be true? Declare yourself Venezuelan, Hugo Chavez is your president. It should be (almost) that easy. If that seems extreme and disruptive, at least executive power should be curtailed, for surely it is the antithesis of long tail politics. And being able to live and work in any jurisdiction should be a given.

Now, I don’t think this would work exactly as he describes it. If Mike declares himself Venezuelan and steals my hubcaps, I still want the American police to arrest him, rather than waiting for Venezuelan police to fly up and deal with it. But this is an interesting way to think about federalism. One of the great virtues of the American political system is that left-wingers can move to San Francisco or Boston and get policies they generally like, while right-wingers move to Salt Lake City or Birmingham to get the kind of government they want. To some extent, federalism allows us to have the same kind of diversity in government that we’re used to getting from the market. We don’t all listen to the same music or eat the same food. Why shouldn’t we have the same kind of choice in politics?

Of course, no matter where we Americans live, we all have to put up with the decisions of the bozo in the White House. Which is why I think it’s so important to move as much power as possible away from Washington, DC. That way, I might not be able to get the entire country to adopt my preferred political views, but I at least have the option of moving to a state or city where the majority shares my values.

In any event, Anderson’s interview is definitely worth reading.

Luis Villa has an interesting post about the evolving understanding of open source software:

I’ve long thought that in open source software we are seeing a trend away from trust in an institution (think: Microsoft) and towards trust in ‘good luck’- i.e., in the statistical likelihood that if you fall, someone will catch you. In open source, this is most manifest in support- instead of calling a 1-800 # (where someone is guaranteed to help you, as long as you’re willing to be on hold for ages and pay sometimes very high charges), one emails a list, where no one is responsible for you, but yet a great percentage of the time, someone will answer anyway. There is no guarantee, but community practices evolve to make it statistically likely that help (or bug fixing, or whatever) will occur. The internet makes this possible- whereas in the past if you wanted free advice, you had to have a close friend with the right skills and free time, you can now draw from a much broader pool of people. If that pool is large enough (and in software, it appears to be) then it is a statistical matter that one of them is likely to have both the right skills and the right amount of free time.

Clay Shirky today makes an argument that this isn’t just something that is occurring in open source, but is hitting other fields of expertise as well: “My belief is that Wikipedia’s success dramatizes instead a change in the nature of authority, moving from trust inhering in guarantees offered by institutions to probabilities created by processes.” Instead of referring to a known expert to get at knowledge, you can ask Wikipedia- which is the output of a dialectic process which may fail in specific instances but which Clay seems to suggest can be trusted more than any one institution’s processes in the long run.

This is an excellent point, but it’s actually not a new one. Two examples that immediately spring to mind are Darwin’s The Origin of Species and Friedrich Hayek’s The Road to Serfdom (and, more specifically, his subsequent essay “The Use of Knowledge in Society”). Darwin and Hayek each described decentralized processes in which the correctness of the result is produced by statistical processes, rather than by the good judgment of a trusted authority.
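
Luis’s “statistical likelihood” point can be made concrete with a toy model. Suppose each person in the pool independently has some small chance of having both the right skills and the free time to help; the odds that at least one of them answers grow quickly with the size of the pool. (The numbers below are invented purely for illustration.)

```python
# Toy model: if each of n readers independently has probability p of having
# both the skills and the free time to answer, the chance that at least one
# answers is 1 - (1 - p)**n.
def chance_of_help(n: int, p: float) -> float:
    return 1 - (1 - p) ** n

# Illustrative (made-up) per-reader probability of 0.5%.
for n in (10, 100, 1000):
    print(f"{n:5d} readers: {chance_of_help(n, 0.005):.1%} chance someone answers")
# 10 readers: ~4.9%; 100 readers: ~39.4%; 1000 readers: ~99.3%
```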


Another Tech Exec Badmouths DRM

September 22, 2006

It’s interesting how people on the technology side of the media business tend to badmouth digital rights management technology even as they acquiesce to the content industry’s demands for it. We’ve seen how Steve Jobs bluntly admitted that DRM is not an effective piracy deterrent, just months before rolling out what became one of the world’s most widely deployed DRM schemes. And we’ve seen how Yahoo has pointed out to the labels that DRM does little more than inconvenience paying customers. Now Ashwin Navin, co-founder of the BitTorrent service, is badmouthing the concept even as his company implements it at the behest of Hollywood:

The reason it’s bad for content providers is because typically a DRM ties a user to one hardware platform, so if I buy all my music on iTunes, I can’t take that content to another hardware environment or another operating platform. There are a certain number of consumers who will be turned off by that, especially people who fear that they may invest in a lot of purchases on one platform today and be frustrated later when they try to switch to another platform, and be turned off with the whole experience. Or some users might not invest in any new content today because they’re not sure if they want to have an iPod for the rest of their life.

Quite so. The people who pay for your content are not the enemy, and it’s counterproductive to create headaches for them.

Hat tip: Ars Technica

Bone-Headed Belgium Brouhaha

September 22, 2006

Techdirt highlights an incredibly wrongheaded decision that was handed down this week by the Belgian courts:

In the ongoing case where a bunch of newspaper publishers are trying to force Google to pay them to index them and send them traffic (a move that has search engine optimizers worldwide wondering what they could possibly be thinking), Google appealed both parts of the ruling. The bigger issue (the indexing and showing of links to certain Belgian news sources) will be heard on appeal in November. However, on the issue of forcing Google to place the entire text of the legal order on the front of both google.be and news.google.be, the Belgian courts have turned down Google’s appeal, and said they will start fining the company if it does not place the entire text (with no commentary, either) on both websites. This seems drastic and entirely unnecessary for a variety of reasons. All it really seems to do is broadcast the backwardness with which Belgian news publishers view the internet. It makes you wonder… do Belgian publishers require libraries to pay them extra money to list their books in a card catalog? What this really highlights, however, is that there are still plenty of industries out there that don’t necessarily understand how the internet works–and that can cause all sorts of problems for internet companies who assume most people understand when things are being done for their benefit.

The legal issues here are pretty well settled on this side of the Atlantic. Deep linking has been repeatedly upheld by American courts, and site administrators have several ways to remove their site from Google’s index and cache on request. The issue here is really about what the default should be: does Google have to get sites to opt in to search engines, or do the sites have to opt out? If the courts were to uphold the former position, it would have a devastating impact on the search engine industry, because the logistics of getting opt-in permission from millions of individual site owners would likely be beyond the resources of all but the largest companies. If you want a stagnant search engine industry dominated by Microsoft, Google, and Yahoo, just set up copyright hurdles that will make it virtually impossible for new firms to enter the market.
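
Those opt-out mechanisms are simple and long established. A site that doesn’t want to be crawled can say so in a robots.txt file at its root, which Googlebot and other well-behaved crawlers honor. For example:

```
# robots.txt at the site root: asks all compliant crawlers to skip the site
User-agent: *
Disallow: /
```

A page-level alternative is a meta tag such as <meta name="robots" content="noindex, noarchive">, which tells compliant crawlers not to index the page or keep a cached copy of it.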

Update: It’s been pointed out to me that I should make clear the distinction between law and policy here. I have no idea whether the case was correctly decided as a matter of Belgian law, about which I know nothing. It’s quite possible that the Belgian courts decided this case correctly based on the laws on the books in Belgium. My point was simply that this decision is likely to have bad policy outcomes. I should have made that clearer.

Sol Schildhause

September 22, 2006

Supporters of free markets and free speech in communications lost a friend this past week with the passing of Sol Schildhause at the age of 89. While perhaps not well-known to many today, Sol was for decades a fixture in the world of cable TV, serving as the first head of the FCC’s cable bureau from 1966 to 1974–where he fought against rules that protected broadcasters from cable TV competition–and later as an attorney and chairman of the Media Institute, where he worked tirelessly for competition in cable TV itself. He was particularly instrumental in the effort to end exclusive cable franchising on the grounds that it was an unconstitutional violation of free speech. The Supreme Court decision that resulted from those efforts established that cable television firms were entitled to First Amendment protection, although it stopped short of banning exclusive local franchising.

Schildhause always seemed the maverick in his work, a happy warrior fighting against the status quo. This was evident even during his years at the FCC, where he was far from your typical bureaucrat. Sometimes this caused difficulties, as related by Tom Hazlett (now of George Mason University) in a 1998 article for Reason Magazine entitled “Busy Work.”


Honestly, I don’t get it. How in the world does the government lose so many laptop computers? I don’t know if you heard this yesterday, but Sonoma County, California, authorities reported that they had lost one-time JonBenet Ramsey murder suspect John Mark Karr’s laptop, which supposedly contained evidence of child pornography that could have been used to help prosecute him. In other words, we basically bought this freak a free plane ride back from Thailand and then gave him a big “Get Out of Jail Free” card. Brilliant. How in the world do you lose the laptop of the guy who has been all over the news for the past month?

But wait, there’s more missing laptop news. In response to an inquiry from the House Committee on Government Reform, 17 federal agencies were asked to report any loss of computers holding sensitive personal information. The results, revealed yesterday, are staggering. According to Alan Sipress of The Washington Post: “More than 1,100 laptop computers have vanished from the Department of Commerce since 2001, including nearly 250 from the Census Bureau containing such personal information as names, incomes and Social Security numbers…” The Census Bureau’s lost laptops alone could have compromised the personal information of about 6,200 households. Apparently, according to MSNBC, “Fifteen handheld devices used to record survey data for testing processes in preparation for the 2010 Census also were lost, the [Census] department said.” (And you thought that the Census was accurate!) Other government departments reporting lost computers with personal information include the departments of Agriculture, Defense, Education, Energy, Health and Human Services, and Transportation, as well as the Federal Trade Commission.

Of course, all this comes on top of the lost laptop scandal over at the Department of Veterans Affairs this summer. One lost laptop contained unencrypted information on about 26.5 million people and another had information on about 38,000 hospital patients. And in August, the Department of Transportation revealed that a laptop containing roughly 133,000 drivers’ and pilots’ records (including Social Security numbers) had been stolen.

I honestly don’t understand how government agencies and officials keep losing all these laptops, but the next time they tell us that we can trust them with personal information and other sensitive things, I hope we all remember these incidents. This is outrageous.

Every week, I look at a software patent that’s been in the news. You can see previous installments in the series here. Before I get to this week’s patent, I wanted to note that the Public Patent Foundation has launched Software Patent Watch, a new blog that tracks the software patent problem. On Tuesday they announced that the patent office has broken the all-time record for software patents in a single year, and is on track to issue 40,000 patents by year’s end. That’s more than 100 software patents per day.

Luckily, none of those tens of thousands of patents produced any high-profile litigation this week, so I thought I’d cover one of the classics of recent software patent litigation, Microsoft’s (and now Apple’s) legal battle with Burst.com. Burst sued Microsoft back in 2002, claiming that Microsoft’s Windows Media software violates its patents. Microsoft settled the dispute last year, and Burst turned its legal guns on Apple in April, claiming that Apple stole the same “technology.”
