August 2008

I too am sad to see William Patry hanging up his spurs. I can sympathize with a lot of what he says. I too consider myself a copyright centrist and a defender of copyright’s traditions, and so find it frustrating to be forced by recent trends to be constantly on the “anti-copyright” side of every argument. However, I don’t share Patry’s depression regarding recent trends in the copyright world, because while the legislative developments over the last 30 years have been an unbroken string of disasters, most other aspects of the copyright system have actually gone pretty well.

One ray of light is the courts, which continue to get more right than they get wrong. The courts have, for example, tended to uphold the first sale doctrine and fair use against concerted challenges from the copyright industries. Had Congress not passed the 1976 Copyright Act, the NET Act in 1997, and the DMCA and CTEA in 1998, my sense is that we’d actually have a pretty balanced copyright system. This suggests to me that restoring copyright sanity wouldn’t actually be that hard, if Congress were ever inclined to do so. To a large extent, it would simply have to repeal the bad legislation enacted during the 1990s.

I can think of two reasons my outlook might be more optimistic than Patry’s. One is that I’m younger than he is. I graduated from high school in 1998, which was almost certainly the low point when it comes to copyright policy on the Hill. While advocates of balanced copyright haven’t won any major legislative victories since then, they have blocked most of the bad ideas that have come down the pike. We killed Fritz Hollings’s godawful SSSCA, the broadcast flag, “analog hole” legislation, and so forth. Given the lopsided advantages of the copyright maximalists in terms of funding and lobbying muscle, holding our own isn’t bad.

I think another reason I might be less inclined to get depressed than Patry is that I’m not a copyright lawyer. One of the most important trends of the last couple of decades is a steady divergence between the letter of copyright law and people’s actual practice. At the same time copyright law has gotten more draconian, it has also grown less powerful. More and more people are simply ignoring copyright law and doing as they please. A few of them get caught and face draconian penalties, but the vast majority ignore the law without any real consequences.

I imagine it is depressing for a copyright lawyer to see an ever-growing chasm between the letter of the law and people’s actual behavior. The copyright lobby’s extremism is steadily making copyright law less relevant and pushing more and more people to simply ignore it. That’s depressing for someone who loves copyright law, but I’m not sure it’s so terrible for the rest of us. I would, of course, prefer to have a reasonable set of copyright laws that most people would respect and obey. But I’m not sure it’s such a terrible thing when people react to unreasonable laws by ignoring them. Eventually, Congress will notice that there’s little correspondence between what people are doing and what the law says they ought to be doing, and they’ll change the laws accordingly. I’d prefer that happen sooner rather than later, but I have little doubt that it will happen, and I’m not going to lose sleep over it in the interim.

A couple of years ago I plugged Jerry Brito’s spectrum commons paper. What I said in that post is still true: it’s a great paper that highlights the central challenge of the commons approach. Specifically, a commons will typically require a controller, that controller will almost always be the government, and there’s therefore a danger of re-introducing all the maladies that have traditionally afflicted command-and-control regulation of spectrum.

I’m re-reading the paper after having read the FCC’s spectrum task force report, and while I still agree with the general thrust of Jerry’s paper, I think he overstates his case in a few places. In particular:

Only if spectrum is first allocated for flexible use, with few if any conditions on its use, can a commons or a property rights regime help overcome the inefficiencies of command-and-control spectrum management. For example, if spectrum is allocated for flexible use, a property rights regime will allow the owner of spectrum to put it to the most valuable use or sell it to someone who will. Similarly, if there are no restrictions on use, a commons will allow anyone to use the spectrum however she sees fit, thus overcoming command-and-control misallocation.

However, while title to spectrum could theoretically be auctioned off in fee simple with no strings attached, a government-created and -managed commons will always have its usage rules set through a command-and-control process. Users of a government commons might not be explicitly restricted in the applications they can deploy over the spectrum, but they will have to comply with the sharing rules that govern the commons. Sharing rules, which will be established through regulation, will in turn limit the types and number of applications that can be deployed.

I think the difficulty here is that just as Benkler and Lessig over-idealize the commons by ignoring the inevitable role for government in setting standards, so this over-idealizes the spectrum property regime. It’s not true that spectrum “could theoretically be auctioned off in fee simple with no strings attached.” The key thing to remember here is that electromagnetic waves don’t respect boundaries established by the legal system. There will always be a need for technical rules to prevent interference between adjacent rights holders. If you hold a spectrum right in a geographic territory adjacent to mine, the government is going to need some rules about how much of your transmissions can “leak” onto my property before it counts as a trespass.
Continue reading →

I regret to report the end of William F. Patry’s Copyright Blog. Patry, author of a superb multi-volume treatise on copyright law and Google’s Senior Copyright Counsel, not only offered a feast of news and commentary for copyright geeks; he offered it up in style. Consider this, among the many sound reasons he cites for ending his blog:

Copyright law has abandoned its reason for being: to encourage learning and the creation of new works. Instead, its principal functions now are to preserve existing failed business models, to suppress new business models and technologies, and to obtain, if possible, enormous windfall profits from activity that not only causes no harm, but which is beneficial to copyright owners. Like Humpty-Dumpty, the copyright law we used to know can never be put back together again: multilateral and trade agreements have ensured that, and quite deliberately.

In short, Patry found blogging about copyright simply too depressing to keep up. I certainly understand that feeling, though I find righteous indignation a fair remedy for weary sadness. At any rate, I thank Patry for his long and selfless blogging, wish him happier diversions, and look forward to the day when we can discuss copyright’s reformation with smiling pride.

[Crossposted at Agoraphilia and Technology Liberation Front.]

As expected, the FCC has chosen Comcast as the target of its biggest net neutrality enforcement action to date.  I wonder whether the FCC has actually chosen a good set of facts to serve as the foundation for what may possibly be a broad new precedent (we won’t know how broad until the commission publishes the order), considering that the commission will likely be forced to defend it in court.  Like it or not, FCC decisions are required to have a “rational basis.”

FCC Chairman Kevin Martin suggests Comcast acted atrociously:

While Comcast claimed its intent was to manage congestion, the evidence told a different story:

  • Contrary to Comcast’s claims, they blocked customers who were using very little bandwidth simply because they were using a disfavored application;
  • Contrary to Comcast’s claims, they did not affect customers using an extraordinary amount of bandwidth even during periods of peak network congestion as long as he wasn’t using a disfavored application; 
  • Contrary to Comcast’s claims, they delayed and blocked customers using a disfavored application even when there was no network congestion;
  • Contrary to Comcast’s claims, the activity extended to regions much larger than where it claimed congestion occurred.

In short, they were not simply managing their network; they had arbitrarily picked an application and blocked their subscribers’ access to it.

Yet Commissioner Robert McDowell seems to claim that the evidence is insubstantial:

The truth is, the FCC does not know what Comcast did or did not do. The evidence in the record is thin and conflicting.  All we have to rely on are the apparently unsigned declarations of three individuals representing the complainant’s view, some press reports, and the conflicting declaration of a Comcast employee. The rest of the record consists purely of differing opinions and conjecture. [footnote omitted]

Continue reading →

WASHINGTON, August 1 – The Federal Communications Commission’s enforcement action against Comcast can be seen either as a limited response to a company’s deceptive practices, or a sweeping new venture by the agency into regulating internet policy.

In ruling against Comcast on Friday, the agency ordered the company to “disclose the details of its discriminatory network management practices,” “submit a compliance plan” to end those practices by year-end, and “disclose to customers and the [FCC] the network management practices that will replace current practices.”

At issue in the decision was whether Comcast had engaged in “reasonable network management” practices when it delayed and effectively blocked access to users of BitTorrent, a peer-to-peer software program.

Although BitTorrent had already settled its complaints with Comcast, FCC Chairman Kevin Martin said that FCC action was necessary because the complaint had been brought by Free Press and Public Knowledge, two non-profit groups. The FCC did not impose a fine.

Martin said that he viewed the agency’s decision to punish the cable operator as a quasi-judicial matter: a “fact-intensive inquiry” against a specific company that it found to have “selectively block[ed]” peer-to-peer traffic.

[Continue reading “FCC Hammers Comcast For Deception and Unreasonable Internet Management“]

[A guest post from Tim Wu]

Well, it’s always fun to have two people you respect read your work, and such is the case with Tim and Adam, though to be honest I probably enjoyed Tim’s analysis a little more.

Adam’s reaction is too strong, and doesn’t really get at the main points in the op-ed. The main point was this: that bandwidth has become an essential input in an economy that depends heavily on moving information. For that reason we must gain a sensitivity to the issues of supply and demand surrounding it. If anyone disagrees with that, I’d love to hear why.

I use the comparison to gas and energy because we all know that when gas prices go up or down, large parts of the economy are affected, from tourism through, say, bowling alleys. What I am saying is that bandwidth may have a similar nature: that if prices are high, it affects all of the information-related markets in interesting ways, from startup video services through Google. It is still early in the age of the internet economy, so this may be less obvious at this point.

If you agree with this, you must care about industry structure and government’s role in suppressing or helping competition in that market.

Meanwhile, while the OPEC example may be a tad dramatic, harping on the fact that OPEC is comprised of nation-states, as opposed to firms, is a mistake. From an economic perspective, why do we care if it is, say, a worldwide private conspiracy setting prices as opposed to a conspiracy of nation states? The effect on prices is the same whether it’s four firms setting food prices (as in the 1990s, with the Archer Daniels Midland price-fixing cases) or four foreign governments. It is harder to stop the governments, because they rarely respond to lawsuits, but the economic consequences, so long as the price-fixing conspiracy lasts, are no different.

A point made in the comments is also true: telecom tends to be in the realm of state-supported or regulated monopoly, and so there is some confusion as to whether what we are talking about are really private actors in a pure sense. This is a point Hayek made quite well. If government helps create a monopoly, as it has in the cable and telephone markets, then being concerned about the consequences of that monopoly makes much sense.

Finally, on Tim Lee’s post I take much less issue. I’d just like to point out that I am also an advocate of greater propertization as well as more dedication to the commons; it’s the stuff in the middle I don’t care for. For example, as Tim knows, I would like to see the development of ways for people to own their own fiber connections (Homes with Tails). I also believe that, in broad spectrum reform, there should be more propertization of the airwaves. The only silly position, it seems to me, is to maintain on principle that either a commons or private property is of no use whatsoever.

Web Pro News’ Jason Lee Miller seems to think he’s hoisted my colleague Bret Swanson, and The Progress & Freedom Foundation in general, on our own collective  petard.  Bret had responded to Tim Wu’s NYT op-ed by questioning Wu’s argument for developing “alternative supplies of bandwidth” to free us from the tyranny of the OPEC-like broadband cartel:

Unlike natural resources such as oil, which, while abundant, are at some point finite, bandwidth is potentially infinite. The miraculous microcosmic spectrum reuse capabilities of optical fiber and even wireless radiation improve at a rate far faster than any of our macrocosmic machines and minerals. It is far more efficient to move electrons than atoms, and yet more efficient to move photons. Left unfettered, these technologies will continue delivering bandwidth abundance.

Miller suggests that this response to Wu destroys arguments Bret and others at PFF have made against net neutrality regulation–a crusade led by Wu (who taught me Internet law, as it happens):

So what [Swanson is] saying is bandwidth scarcity is a notion invented by internet service providers and wireless providers to jack up prices and provide excuses for interfering with competing services on their networks. Nice. In a weird way, Swanson focuses so hard on disproving Wu’s analogy one way, he misses how the analogy is proved in another: a few organizations (government or not) controlling an important resource and forcing artificial scarcity in order to control the market for that resource is called a cartel.

Miller’s “Gotcha!” rests on the seemingly undeniable premise that broadband can’t be both abundant (as Bret argues) and scarce (such that ISPs must manage traffic on their networks, however non-neutral that may be). But in fact, this seeming contradiction is inherent in the very nature of the Internet–and the way Internet access is currently priced. Continue reading →

…to cover the hearing at which Comcast is expected to be punished for violations of Network Neutrality. Fortunately, the Federal Communications Commission did not start on time. The great thing about the Kevin Martin FCC is that you never have to worry about being late. For example, we’re live at the FCC for the 9:30 a.m. meeting:

[Photo: The FCC, 9:49 a.m.]

I’ll be live-Twittering the event, so check back on DrewClark.com (look at the column on the right – or just go to Twitter and “follow” me) for the latest updates. Later in the day, I’ll be posting a story about the event at BroadbandCensus.com.