I recently completed a draft of Copyright as Intellectual Property Privilege, 58 Syracuse L. Rev. __ (2007) (forthcoming) (invited). Here’s an abstract:

We often call copyright a species of intellectual property, abbreviating it, “IP.” This brief paper suggests that we consider copyright as another sort of IP: an intellectual privilege. Though copyright doubtless has some property-like attributes, it more closely resembles a special statutory benefit than it does a right, general in nature and grounded in common law, deserving the title of “property.” To call copyright a “privilege” accurately reflects legal and popular usage, past and present. It moreover offers salutary policy results, protecting property’s good name and rebalancing the public choice pressures that drive copyright policy. We face a choice between two ways of thinking about, and talking about, copyright: as an intellectual property that authors and their assigns own, or as an intellectual privilege that they merely hold. Perhaps no label can fully capture the unique and protean nature of copyright. Recognizing it as a form of intellectual privilege would, however, help to keep copyright within its proper legal limits.


Brad Stone of the New York Times has a good post on the Bits Blog regarding the Comcast kerfuffle (Jim, Why are we calling it that, again?). The gist:

It seems unlikely that Comcast has a secret agenda to shut down file-sharing applications and combat piracy on its network. But the company is clearly trying to have it both ways. It claims it is a neutral Internet service provider that treats all packets equally, not blocking or “shaping” its Internet traffic. Meanwhile it also positions itself as the champion of average Internet users whose speeds are being slowed by file-sharing.

The problem Comcast may now be facing is that in the absence of a plain explanation about what the company does to disadvantage certain applications in the name of managing traffic on its network, anecdotal reports and conspiracy theories fill the vacuum.

I have no doubt that Comcast’s practices stem from trying to provide a good-quality service for the majority of their customers. The problem their actions pose for those of us who advocate against unnecessary regulation, however, is that they’re not being completely clear about what they’re doing (although they’re trying).

For example, if the problem is one percent of users who tend to be bandwidth hogs, why not address the users instead of a protocol? AOL dial-up and T-Mobile wireless are able to meter customer use above a certain allotment without any negative privacy implications. It seems like Comcast does in fact target bandwidth hogs, although it doesn’t publish what the limit is. These sorts of unknowns stir up the conspiracy theories Stone mentions. That makes explaining to folks that there’s nothing nefarious here pretty tough.
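
Metering of that sort is simple in principle. Here is a back-of-the-envelope sketch in Python (the 250 GB figure and the account names are made up for illustration; Comcast has not published a cap): count each account’s bytes over the billing period and flag only the accounts that exceed a published allotment.

    from collections import defaultdict

    # Hypothetical monthly allotment; the figure is illustrative only.
    MONTHLY_ALLOTMENT_BYTES = 250 * 10**9   # 250 GB

    usage = defaultdict(int)   # account id -> bytes transferred this billing period

    def record_transfer(account_id, nbytes):
        """Called for each chunk of traffic attributed to an account."""
        usage[account_id] += nbytes

    def accounts_over_allotment():
        """Accounts that exceeded the cap and might be warned or slowed."""
        return [acct for acct, total in usage.items()
                if total > MONTHLY_ALLOTMENT_BYTES]

    record_transfer("acct-1", 5 * 10**9)     # a light user
    record_transfer("acct-2", 300 * 10**9)   # a bandwidth hog
    print(accounts_over_allotment())         # ['acct-2']

Nothing in that meter ever inspects which protocol the bytes belong to; it only needs per-account totals.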

If you’d like to get a flavor for the sort of impact that one (tenacious) citizen can have on making government data more transparent, check out this Google Tech Talk by one of my personal heroes, Carl Malamud. (I write about his exploits in my new paper on online transparency.) He talks about cajoling the ITU to put standards online, forcing the SEC to put its public information online, and his new project to live-stream and archive video of all congressional and agency hearings in Washington. He’s a real inspiration.

Comcast was kind enough to invite me to a conference call between one of their engineers and some think tank folks. They feel their policies have been mischaracterized in the press. While I found some of the information they shared helpful, I frankly don’t think they helped their case very much.

While he didn’t say so explicitly, the Comcast guy seemed to implicitly concede that the basic allegations are true. He emphasized that they were not blocking any traffic, but that in high-congestion situations they did “delay” peer-to-peer traffic to ease the load. Apparently the Lotus Notes thing was a bug that they’re working to fix. He refused to go into much detail about exactly how this “delay” was accomplished, but presumably if the AP’s story about TCP resets were inaccurate, he would have said so.
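
For what it’s worth, the behavior the AP described is straightforward to observe. Here is a rough sketch, assuming the scapy packet-capture library and a placeholder interface name of my choosing, of the kind of test an independent user could run: start a long peer-to-peer transfer and log every TCP reset that arrives. Resets showing up repeatedly in the middle of otherwise healthy connections are the telltale signature of injected RST packets.

    # Requires root privileges and the scapy library; "eth0" is a placeholder.
    from scapy.all import sniff, IP, TCP

    def log_reset(pkt):
        # 0x04 is the RST bit in the TCP flags field.
        if pkt.haslayer(IP) and pkt.haslayer(TCP) and pkt[TCP].flags & 0x04:
            print("RST  %s:%d -> %s:%d" % (
                pkt[IP].src, pkt[TCP].sport,
                pkt[IP].dst, pkt[TCP].dport))

    # Watch the interface carrying the file-transfer session.
    sniff(iface="eth0", filter="tcp", prn=log_reset, store=0)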

To be fair, most of the people on the call were lawyers or economists, not technologists, so it’s possible he just didn’t think anyone other than me would care about these details. Still, it seems like part of the point of having an engineer on the call would be to answer engineering-type questions. He also made a couple of points that I found a little patronizing. For example, he emphasized that most users wouldn’t even be able to detect the traffic-shaping techniques Comcast uses without special equipment and training. Which is true, I guess, but rather beside the point.

If you haven’t read it yet, I recommend the discussion in response to Jerry’s post. I don’t know enough about the internals of cable modem protocols to know for sure who’s right, but Tom seems to me to make a good point when he says that forging reset packets is a wasteful and disruptive way to accomplish traffic shaping. The TCP/IP protocol stack is layered for a reason, and I can’t see any justification for routers to be mucking around at the TCP layer when throttling can perfectly well be accomplished in a protocol-neutral manner at the IP layer.
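
To make the distinction concrete, here is a minimal sketch of mine, in Python, of what protocol-neutral throttling at the IP layer looks like: a per-subscriber token bucket that forwards or delays packets based solely on how many bytes the subscriber has sent recently. This illustrates the general technique, not anything Comcast or any router vendor actually ships.

    import time

    class TokenBucket:
        """Per-subscriber rate limiter; decisions depend only on bytes, never on protocol."""

        def __init__(self, rate_bytes_per_sec, burst_bytes):
            self.rate = rate_bytes_per_sec
            self.capacity = burst_bytes
            self.tokens = burst_bytes
            self.last = time.time()

        def allow(self, packet_len):
            now = time.time()
            # Refill tokens for the elapsed time, up to the burst ceiling.
            self.tokens = min(self.capacity,
                              self.tokens + (now - self.last) * self.rate)
            self.last = now
            if self.tokens >= packet_len:
                self.tokens -= packet_len
                return True    # forward the packet now
            return False       # queue or drop it until tokens accumulate

    # e.g., cap a congested subscriber at roughly 1 Mbps with a 64 KB burst
    bucket = TokenBucket(rate_bytes_per_sec=125000, burst_bytes=65536)

Nothing in that logic cares whether a given packet carries BitTorrent, Lotus Notes, or web traffic, which is the whole point of doing the work at the IP layer.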

Someone asked why Comcast didn’t throttle on a user-by-user basis rather than a protocol-by-protocol basis, and he said they were concerned with the privacy implications of that approach. That doesn’t make a lot of sense to me. Very few users are going to consider the number of bits they’ve transferred in a given time period to be confidential information.

We also asked why there wasn’t more transparency about what throttling methods were being used and against which protocols. Apparently, Comcast feels that disclosing those sorts of details would make it easier for users to circumvent its throttling efforts. That doesn’t strike me as terribly persuasive; customers are entitled to know what they’re getting for their money, and people are going to figure it out sooner or later anyway. All secrecy accomplishes is making the company look bad when someone discovers the practice and reports it to the press.

With all that said, I’m not sure I see an obvious policy response. Regardless of what the law says, there’s always going to be a certain amount of cat-and-mouse between ISPs and the heaviest network users. As Don Marti has pointed out, workarounds are easy to find. Add in a healthy dose of negative publicity, and while Comcast’s behavior is far from laudable, it’s far from obvious that it’s a serious enough problem to justify giving the FCC the opportunity to second-guess every ISP’s routing policies.

I’ve been laboring for a few months on a paper about government transparency on the internet, and I’m happy to say that it’s now available as a working paper. In it I show that a lot of government information that is supposed to be publicly available is only nominally available because it’s not online. When data does make it online, it’s often useless; it’s as if the .gov domain has a prohibition on XML and reliable searches.

First I look at independent third parties (such as GovTrack.us) that are doing yeoman’s work by picking up the slack where government fails and making data available online in flexible formats. Then I look at still other third parties who are taking the liberated data and using it in mashups (such as MAPLight.org) and crowdsourcing (such as our own Jim Harper’s WashingtonWatch.com). Mashups of government data help highlight otherwise hidden connections, and crowdsourcing makes light work of sifting through mountains of data. If I may corrupt Eric Raymond’s Linus’s Law for a moment: “Given enough eyeballs, all corruption is shallow.” In the coming days I plan to write a bunch more on how online tools can shed light on government, including a series dissecting the FCC’s website (not for the squeamish).
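
For readers who want a concrete sense of what a mashup actually does, here is a toy sketch in Python. The file names and columns are hypothetical stand-ins, not MAPLight’s real data or code; the point is simply that joining two independently published datasets on a shared key surfaces connections that are invisible in either dataset alone.

    import csv
    from collections import defaultdict

    # Hypothetical inputs: one file lists campaign contributions by legislator,
    # the other lists how each legislator voted on a particular bill.
    contributions = defaultdict(float)   # legislator -> total dollars
    with open("contributions.csv") as f:
        for row in csv.DictReader(f):
            contributions[row["legislator"]] += float(row["amount"])

    with open("votes.csv") as f:
        for row in csv.DictReader(f):
            name, vote = row["legislator"], row["vote"]
            print(f"{name:<25} voted {vote:<4} after receiving ${contributions[name]:,.0f}")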

I believe opening up government to online scrutiny is immensely important. If we’re going to hold government accountable for its actions, we need to know what those actions are. The Sunlight Foundation has been doing fantastic work on this front and I would encourage you to visit them and especially their Open House Project blog. I would also encourage you to send me any comments you might have on my paper as I’m still perfecting it before I submit it to journals.

From the WSJ, for subscribers, Holman Jenkins’s rather cynical take on net neutrality and Google, “Sort of Evil.”

On the policies that contributed to the rise of cell phones in Africa–and the resulting reduction in poverty.

The Financial Times has an interesting email discussion between Richard Epstein and Harry First on the merits of antitrust actions against Microsoft.

As Jerry wrote up briefly over the weekend, Comcast is alleged to have been “shaping” traffic over its network. Proponents of broadband regulation have already gotten a bit conclusory, even triumphal, expecting that this makes the case for public utility regulation of broadband service.

But I expect that we’ll soon learn more about the situation, and the conclusions to be drawn from it will be less obvious. There might be legitimate security reasons for what Comcast has done. We’ll see. We should expect full disclosure from Comcast.

My take: If Comcast is “shaping” traffic in ways inconsistent with its terms of service, for non-network-security reasons such as copyright protection or surreptitious usage control, it shouldn’t be doing that.

More important is the meta-point: Independent testers found what they believe to be an impropriety in Comcast’s provision of broadband. They called it out, and interested parties among advocacy organizations and the media swarmed all over it. Comcast has to answer the charge, whether meritorious or not.

These are market processes working their will, and the outcome will be reached in short order – whether Comcast backs away from an improper practice, whether we learn that Comcast was not acting badly, or whether Comcast amends its terms to reflect what it thinks serves customers best.

This doesn’t conclude the discussion of whether there should be regulation. It allows us to refine the discussion: The proponents of regulation should now be challenged to write the regulation that would suss out this kind of (still alleged) misbehavior, distinguish it from appropriate network management, and ban it – without wrapping provision of Internet service in red tape or creating regulatory capture that suppresses competition. Good luck with that!

Obviously, more to come.

The Financial Times posted an article this week about the ongoing push by state attorneys general to impose age verification regulation on social networking sites and followed it up with an outstanding editorial entitled “Out of MySpace.” They note:

Age verification… just will not work. The practical problems are considerable. Fourteen-year-olds do not have drivers’ licences and credit cards that can be checked via established agencies. The sites could insist on verifying the parents, but anyone who believes that a teenager will not “borrow” his father’s Visa has never been 14 years old.

The consequences of successful age verification, meanwhile, would be even worse. Minors would be driven off mainstream sites such as MySpace and Facebook and on to unaccountable offshore alternatives or the chaos of newsgroups and minor bulletin boards. There they would be far more vulnerable than on MySpace, which now makes efforts to keep tabs on its users.

That’s exactly right, and it very much follows what I have found in my own research. If you’re interested, check out my paper “Social Networking and Age Verification: Many Hard Questions; No Easy Solutions,” as well as the transcript of an event I hosted in March on “Age Verification for Social Networking Sites: Is it Possible? Is it Desirable?”

As I wrote about here, the most recent big showdown in the states took place in North Carolina in July. But it won’t be the last.