I’ve been neglecting my blogging over the past week since my free time has been occupied with setting up my latest high-tech toy–Sony’s “Location Free TV.” To keep pace with the increasingly popular Slingbox, which also allows consumers to space- or place-shift their TV and other video signals, Sony has just released a new box (the LP-20) that retails for $250 and has more features than their first-generation Location Free boxes. As I set it up and troubleshot various connection problems (and I had quite a few), I kept wondering whether this new Sony device would raise any copyright issues.

Like the Sling, Sony’s Location Free box allows you to watch your home TV signals on your personal computer anywhere you want via an Internet connection. An added bonus with the Sony box is the ability to also watch TV remotely on your PlayStation Portable (PSP) gaming device. It’s a very cool feature, but my experience with it so far has been a mixed bag. The PSP suffers from more latency issues (probably due to its more limited wireless networking capabilities), and the picture quality becomes unbearable at times as a result.

But watching TV remotely on my laptop looks pretty good and the desktop software that Sony provides makes it very easy to program in my cable set-top box codes and special buttons (like the button I use to call up my PVR archive so I can watch recorded TV shows while I’m on the road). And I can also use the Location Free box to control another video source, such as my DVD player. So, when I’m stuck in an airport trying to keep my kids from melting down, I can remotely access an animated movie sitting in my DVD tray back home. Very, very cool.

Continue reading →

Samba Blasts Novell

November 13, 2006 · 16 comments

The Samba team is not happy about the Microsoft-Novell deal:

One of the fundamental differences between the proprietary software world and the free software world is that the proprietary software world divides users by forcing them to agree to coercive licensing agreements which restrict their rights to share with each other, whereas the free software world encourages users to unite and share the benefits of the software.

The patent agreement struck between Novell and Microsoft is a divisive agreement. It deals with users and creators of free software differently depending on their “commercial” versus “non-commercial” status, and deals with them differently depending on whether they obtained their free software directly from Novell or from someone else.

The goals of the Free Software community and the GNU GPL allow for no such distinctions.

Furthermore, the GPL makes it clear that all distributors of GPL’d software must stand together in the fight against software patents. Only by standing together do we stand a chance of defending against the peril represented by software patents. With this agreement Novell is attempting to destroy that unified defense, exchanging the long term interests of the entire Free Software community for a short term advantage for Novell over their competitors.

The GPL, at its heart, is about reciprocity: you’re permitted to distribute the software, without restrictions, provided you respect the equal freedom of others to do the same. Although Novell itself hasn’t done anything to directly restrict users’ freedom under the GPL, this agreement is clearly a step in the direction of making non-Novell users of GPL’d software second-class citizens under patent law.

It’s debatable whether the Microsoft-Novell agreement violates the letter of the GPL, and it seems unlikely that anyone will be able to prevail against Novell in court. But I think it’s pretty clear that Novell’s actions violate the spirit of the GPL. It will be interesting to see if the free software community is able to effectively punish Novell through ostracism.

Brooke Oberwetter and I have been having an interesting discussion here and here about network neutrality. I want to start by emphasizing that I wholeheartedly agree with Brooke’s broad point that technologies change over time, and so we should be skeptical of proposals to artificially restrain that evolution by giving government bureaucrats the power to second-guess the design decisions of network engineers. Doubtless the Internet will evolve in ways that neither of us can predict today, and we don’t want the law to stand in the way.

But Brooke went beyond that general point and offered some specific examples of ways she thinks the Internet might change. Her main contention seems to be that the end-to-end principle is overrated, and that “the only reason they’re so revered is because they are simply what is.” I think this is fundamentally mistaken: people have good reasons for revering the end-to-end principle, and abandoning it would be a very bad idea. I’ll discuss some of her specific arguments below the fold.

Continue reading →

I’m trying to wrap my head around the scenario I criticized earlier today, where a broadband ISP charges individual sites for faster speeds.

Let’s suppose, for the sake of argument, that Comcast imposes a half-second delay into the loading of any website that doesn’t pay a special high-speed access fee. The fee might be $1 for every 100,000 page views. This website gets roughly 100,000 page views per month, so we’d owe about $1/month to Comcast if we wanted to avoid having our site load slowly for Comcast customers. A site like Techdirt, which gets roughly 100 times as much traffic as we do, would owe Comcast about $100/month if it didn’t want its traffic slowed. Google, which gets 100,000 times as much traffic as we do, would have to pay about $100,000 per month. Clearly, such a scheme could bring in tens of millions of dollars in additional revenue each year.
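To make the back-of-the-envelope math explicit, here is a small sketch of this hypothetical fee schedule. The rate and the traffic figures are the rough assumptions from the scenario above, not real data:

```python
# Hypothetical fee schedule: $1 per 100,000 monthly page views, owed per ISP.
# Traffic figures are the rough estimates used in the post, not measurements.
FEE_PER_100K_VIEWS = 1.00

sites = {
    "this blog": 100_000,          # ~100,000 page views/month
    "Techdirt": 10_000_000,        # ~100x this blog's traffic
    "Google": 10_000_000_000,      # ~100,000x this blog's traffic
}

def monthly_fee(page_views, fee=FEE_PER_100K_VIEWS):
    """Fee owed to a single ISP for one month of 'high speed' delivery."""
    return page_views / 100_000 * fee

for name, views in sites.items():
    print(f"{name}: ${monthly_fee(views):,.0f}/month per ISP")
# → this blog: $1/month per ISP
# → Techdirt: $100/month per ISP
# → Google: $100,000/month per ISP
```

Multiply each figure by the dozens of ISPs a site would have to pay, and the collection overhead of the scheme becomes apparent.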

Of course, it would be ridiculous for us to send Comcast a $1 check each month, especially since we would presumably be expected to do the same thing with Verizon, AT&T, Charter, Sprint, Qwest, and dozens of smaller ISPs. Running a “high speed” web site would require writing dozens of checks to dozens of different network owners.

Continue reading →

Overblown

November 10, 2006 · 2 comments

Gene Healy reports that John Mueller’s new book, Overblown: How Politicians and the Terrorism Industry Inflate National Security Threats, and Why We Believe Them, is out. I can’t wait to read it. Unfortunately, it will probably make the indignities of airport security even more depressing.

I’ve never understood the point of making you turn off electronic devices on airplanes. I mean, if there are electronic devices out there that interfere with onboard navigation on planes, shouldn’t we be doing something more aggressive about it than simply asking nicely?

Last week, my friend Brooke was kind enough to link approvingly to my post on the phantom threat of network discrimination. Brooke agrees with me that those who think “Verizon is just itching for the opportunity to detect and block every packet of data it carries that mentions the Second Amendment” are nuts.

She goes on to offer an example of a case where network discrimination would be beneficial:

One fear, however, didn’t make Tim’s list; it’s the fear that the ISPs will do exactly what we think they’ll do, which is to introduce tiered pricing for content delivery…

Suppose some new tech-tinkering über-geeks come up with a search engine even better than Google. Because they lack brand recognition, they need to keep expenses at a minimum while word of mouth slowly spreads about their better quality. In net neutrality America, they cannot keep expenses down by opting for lower quality delivery than that offered by Google. Delivery speed is not a viable option for competition; everyone has to ship at the $11 rate. Now imagine that one of the über-geeks is a trust fund baby. He’s so sure that his product is superior, he invests his trust fund in über-geeks, Inc. so they can buy higher speed delivery than Google offers, thus giving Google a serious competitive run for its money. Sadly this too is not an option in net neutrality land.

Prices and price flexibility are essential to competition. The fear that content competition will suffer without regulation is absurd on its face. Indeed, net neutrality regulation will rob new innovators and content creators of the very tools that would make challenging already established businesses possible. It’s little wonder then that the already established businesses–like Amazon, E-bay, and Google, to name a few–are fighting for net neutrality tooth and nail.

Continue reading →

This week I’m going to consider NTP’s patents on wireless email. Fresh from its settlement with Research in Motion (makers of the BlackBerry), NTP sued Palm on Monday over the same patents:

Apparently, the $612.5 million that patent holding firm NTP got out of RIM for its questionable patents wasn’t enough. The company (really, a group of lawyers) has filed a lawsuit against Palm as well. Apparently, the firm is claiming patent violations on the same five patents it used against RIM, as well as two additional ones. However, considering that the US Patent Office has given final rejections to two of the patents in the RIM case and indicated it’s likely to reject the rest, it would seem like NTP doesn’t have much of a leg to stand on. It’s unclear what the other two patents are, though they could be from some new deals NTP has cooked up to get its hands on more patents for the sole purpose of squeezing money out of companies. As for the rejected patents, NTP has indicated that it will appeal the patent rejections–so perhaps they hope to cause enough trouble for Palm while they drag out the process that it’s forced to settle as well.

This is a horrible misuse of the patent system, and is simply taking hundreds of millions of dollars away from what should be a developing market and putting it in the hands of a bunch of greedy lawyers who have done nothing to help move the technology forward in the market place. If you don’t recall, NTP was a holding company that owned some disputed exceptionally broad patents on a concept that was basically “wireless email.” An earlier company had tried to do something with the patents, but failed in the marketplace. RIM came along and successfully innovated in the marketplace (while being a bit of a patent menace itself), and suddenly NTP claimed that no one could do wireless email without paying them for the privilege. The patents were incredibly broad and perfectly obvious and never should have been granted (something the USPTO later would admit in rejecting them). Yet, due to the increasing uncertainty over the lawsuit, and the pressure that put on RIM’s stock, the company was forced to settle, taking money away from R&D efforts and sales and handing it over to the lawyers at NTP so they could turn around and sue more companies that were actually successfully innovating and building products and services people wanted.

Mike’s analysis is exactly right. Here is the original patent. It’s important to emphasize here that there was never any allegation that RIM or Palm ever copied anything from NTP or anyone else associated with these patents. By all accounts, RIM and Palm developed their products completely independently. But that’s irrelevant in patent law: once one company “invents” something–even something as broad as “Hey, maybe we could transmit emails wirelessly!”–and gets a patent for it, no one else is allowed to build that invention without permission from the patent holder.

That’s clearly absurd in a case like this, where the scope of the patent is so broad as to encompass an entire industry. Yet despite the evident absurdity of these patents, and despite the fact that the patent office is now scrambling to correct its mistakes, NTP is still able to extort hundreds of millions of dollars from other companies. And, as Mike points out, it’s truly perverse that our patent system is transferring hundreds of millions of dollars from innovative companies to a pack of greedy lawyers who have never developed a useful product in their lives.

I almost choked on my morning coffee when I saw the headline last week that Novell and Microsoft announced a deal to make their software work together. As someone who once employed VMware to use Word on a machine running Linux, I have to say that I was both surprised and thrilled. And, as someone who closely followed the Microsoft antitrust cases in both the US and Europe, I was astounded. I wish I could call Judge Jackson right now and ask him why he thinks these two competitors, who once looked to be arch enemies, are now joining forces (Novell accused Microsoft of antitrust violations and sued over WordPerfect). But of course Jackson didn’t think Microsoft had any competitors, so perhaps he wouldn’t really understand the question.

The fact that Microsoft and Novell are now teaming up to provide consumers with something they have been clamoring for (interoperability) is proof that the marketplace can deliver benefits to consumers without government help even if the two competitors have a bad history.

Techdirt points out an especially serious example of e-voting gone wrong:

In one of the stories we spotted yesterday about e-voting glitches, it was amusing to see (at the very, very bottom) the idea that “no major problems” were reported for e-voting in Florida. Florida and Ohio, of course, are the two places where e-voting stories have raised the most questions, and there had already been a number of reports of e-voting problems in Florida voting last week when their early polls opened. So, it looks like ABC may need to revise that “no major problems” report, as the EFF points us to a report saying that 13% of the electronic responses in Sarasota County included no vote for Congressional Representative. That means that somewhere between 8,000 to 10,000 people who voted for other things, like governor, appear to have not voted for House Representative–and no one seems to have a good explanation. It’s certainly possible that all those people decided to go “none of the above,” but it seems unlikely–especially since similar undervoting was not seen in other counties covered by the same Congressional district. Also, there were complaints all day about the e-voting machines not properly recording votes in that county. So, while people are asking for a recount… there’s nothing to recount since the machines did not record the votes. Amusingly, the EFF also notes that the very same county had a referendum on the ballot about the e-voting machines, and the people overwhelmingly voted to scrap the machines and bring back paper ballots. So what was it the press was just saying about no major glitches with e-voting?

One of the things that makes computers incredibly useful is that they automate routine tasks so they can be done without human supervision. That’s fantastic for most tasks, but it’s a disaster when the task at hand is recording votes, because it means that if there’s a programming bug, it will do things the same wrong way with each and every voter. And because the counting process is totally opaque, no one notices until it’s too late.
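To illustrate how a single bug replicates across every ballot, here is a purely hypothetical, simplified sketch; the races and the bug are invented for illustration and have nothing to do with the actual Sarasota County software:

```python
# Hypothetical sketch: one off-by-one bug in a ballot-recording loop
# silently drops the same race from every single recorded ballot.
RACES = ["Governor", "U.S. House", "County Referendum"]

def record_ballot_buggy(selections):
    """Records a voter's selections, but the loop bound is off by one,
    so the last race on the ballot is never recorded."""
    recorded = {}
    for i in range(len(RACES) - 1):   # bug: should be range(len(RACES))
        race = RACES[i]
        if race in selections:
            recorded[race] = selections[race]
    return recorded

ballot = {"Governor": "Candidate A", "U.S. House": "Candidate B",
          "County Referendum": "Yes"}
print(record_ballot_buggy(ballot))
# The "County Referendum" selection is missing from every recorded ballot,
# regardless of what each voter actually chose.
```

A poll worker hand-counting paper ballots might misread one ballot; a bug like this misrecords all of them, identically, with no visible sign that anything went wrong.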

E-voting machines may streamline the voting process, but that’s actually not a benefit at all. A slow, labor-intensive voting process means there will be more human eyes around to spot mistakes early enough that they can be corrected. But because we delegated the process to a computer, there were no human beings in a position to notice the problem.