June 2008

 As Mark Twain might have said if he followed spectrum policy: the reports of the death of central planning in Washington have been greatly exaggerated.  As early as next week, the Federal Communications Commission may vote on a plan mooted by Chairman Kevin Martin to auction off 25 MHz of spectrum to the highest bidder, with a catch:  reportedly, the licensee will have to use the spectrum to offer free broadband service, with a network to be built out on a timetable specified by the FCC, and with content that doesn’t offend the regulators in Washington.

For most of the last century, the FCC was in the business of defining how spectrum would be used – deciding not only who would get it, but what services they would use it for, and under what conditions.  Over the past 20 years, however, the idea of central planning has fallen into disrepute, not only internationally (see Soviet Union) but in Washington.   The idea that five individuals in Washington – no matter how intelligent and well-dressed – could know the proper uses and methods of providing wireless services for millions of people became increasingly hard to defend.  After a disastrous start to the cell phone era – which was delayed by a decade or more due to FCC delays – the central planning model was largely replaced by markets, with licenses assigned by open auction, and (more importantly) uses and business models defined by consumer demand.

By the turn of the 21st century, the FCC’s planning colossus had been effectively toppled, with some of the biggest tugs at the ropes coming from the Clinton-era FCC.  The rest, as they say, is economic history.   Over the last 20 years, the number of Americans with wireless has grown from two million to 255 million, while the devices they use have transmogrified from brick phones to multipurpose units that do everything but the user’s laundry.

Chairman Martin’s proposal would take a giant leap backward from this marketplace success.  It isn’t the first attempt by the present FCC to try to direct spectrum use.   In auctions earlier this year of former analog television spectrum, the FCC set aside blocks for “open access” uses, and for spectrum to be used in partnership with public safety users.   The auctions were remarkably unsuccessful – with one block failing to meet its reserve price, and the other fetching far less than comparable unencumbered spectrum.

Don’t miss the discussion between Debbie Rose and TLF’s Cord Blomquist about the DMCA safe harbor. Despite her long experience with the DMCA, Debbie takes what strikes me as an implausible position:

While I could go on for pages about what is wrong with your post, I’ll confine my comments to this: the DMCA does NOT give websites hosting user-generated content a safe-harbor.

The safe-harbor provision is for service providers – in other words, the network operators or owners of the “pipes.” As I wrote in a post last March, this provision was the result of a long and difficult negotiation. As one of the House Judiciary counsels involved in the negotiations, I can assure you that websites such as YouTube were NOT intended to be included in the safe harbor.

Cord does a good job of citing chapter and verse from the DMCA, so I won’t belabor that point further. However, let me observe that the position Debbie is staking out here doesn’t even seem to me to be coherent. The DMCA’s safe harbor relates to “storage at the direction of a user of material that resides on a system or network controlled or operated by or for the service provider.” Now, strictly speaking, nothing “resides” in the Internet’s “pipes.” If this language was intended to be limited to network owners, rather than the operators of servers, you have two hard questions to answer: first, why does the language say “system or network” rather than simply “network?” And second, why does it require the service provider to “remove, or disable access to” the content rather than simply requiring that access be disabled? After all, you can only remove information from a server if you operate the server, and for the most part the servers tend to be operated by someone other than the network owner.

I suppose you could argue that this provision applies only to servers that are operated by ISPs. But that doesn’t make a lot of sense, for two reasons. First, there would be no principled reason to provide a different level of immunity to web hosting services that own their own pipes to the backbone than to web hosting services that rent their pipes from someone else. And more fundamentally, you have the question of defining who counts as an ISP. After all, every web hosting service of non-trivial size administers a network. Certainly Google administers a sizable network connecting all of its servers. So why wouldn’t Google be able to claim the safe harbor as an ISP?

Debbie’s argument also runs counter to common usage. A quick survey of the takedown notices at Chilling Effects makes it clear that there are a ton of people out there whose copyright lawyers regard websites like Digg, Google, Wikipedia, and Gawker as proper targets for a DMCA takedown notice. Now, I suppose it’s possible that all of these copyright lawyers are incompetent, and that they should have filed copyright infringement lawsuits instead. But I tend to doubt it. And at a minimum, if all of these copyright lawyers are confused about the DMCA, there’s a pretty good chance that the judge in the Viacom lawsuit will share their delusion.

Over at Ars, I’ve got a story up about a ruling on fair use in the creationist Intelligent Design movie Expelled:

Imagine There’s No Fair Use

The controversy centers around a segment about an hour into the film. Science advocate PZ Myers argues that greater science literacy would “lead to the erosion of religion,” and expresses the hope that religion would “slowly fade away.” The narrator, Ben Stein, asserts that Myers’ ideas aren’t original. Rather, he is “merely lifting a page out of John Lennon’s songbook.”

The viewer is then treated to a clip from John Lennon’s “Imagine,” with the lyrics “Nothing to kill or die for/And no religion too.” The music is accompanied by black-and-white footage “of a military parade, which gives way to a close up of Joseph Stalin waving.” Next, the film cuts to a guest who argues that there is a connection between “transcendental values” and “what human beings permit themselves to do one to the other.” Evidently, religion is the only thing standing between us and Stalinist dictatorship.

Judge Stein’s task wasn’t to critique the dubious logic of this segment, but to evaluate the narrower question of whether the film’s use of “Imagine” is fair under copyright law. He noted that the film was focused on a subject of public interest, and that the film was commenting on Lennon’s anti-religious message. The excerpting of copyrighted works for purposes of “comment and criticism” is explicitly protected by the Copyright Act, and Judge Stein ruled that this provision applied in this case.

It’s worth keeping in mind that no competent lawyer would have taken Ono’s case if we were talking about a quote from one of Lennon’s books rather than a clip from his song. But there’s no logical difference between the two. The music clip in this case is playing precisely the same role in this movie as a blockquote plays in the average blog post. Moreover, the “Imagine” quote, at a dozen or so words, is much shorter than most blockquotes. I conclude:

It is unfortunate that Lennon’s heirs sought to use copyright law to squelch criticism of Lennon’s lyrics. No matter how dishonest Stein and company’s arguments may be, they have the right to make them, and copyright must give way to the First Amendment. Ono’s aggressive tactics will give Stein and company an undeserved PR victory, allowing them to play the beleaguered underdogs fighting the “Darwinist” establishment. The way to counter Expelled is with logic and evidence, of which there’s an ample supply. Overzealous application of copyright law is counterproductive.

I haven’t had time to read it yet, but Princeton’s IT Policy Center has a new paper out about open file formats and government transparency that’s worth checking out:

If the next Presidential administration really wants to embrace the potential of Internet-enabled government transparency, it should follow a counter-intuitive but ultimately compelling strategy: reduce the federal role in presenting important government information to citizens. Today, government bodies consider their own websites to be a higher priority than technical infrastructures that open up their data for others to use. We argue that this understanding is a mistake. It would be preferable for government to understand providing reusable data, rather than providing websites, as the core of its online publishing responsibility.

In the current Presidential cycle, all three candidates have indicated that they think the federal government could make better use of the Internet. Barack Obama’s platform explicitly endorses “making government data available online in universally accessible formats.” Hillary Clinton, meanwhile, remarked that she wants to see much more government information online. John McCain, although expressing excitement about the Internet, has allowed that he would like to delegate the issue, possibly to a vice president.

But the situation to which these candidates are responding — the wide gap between the exciting uses of Internet technology by private parties, on the one hand, and the government’s lagging technical infrastructure on the other — is not new. The federal government has shown itself consistently unable to keep pace with the fast-evolving power of the Internet.

In order for public data to benefit from the same innovation and dynamism that characterize private parties’ use of the Internet, the federal government must reimagine its role as an information provider. Rather than struggling, as it currently does, to design sites that meet each end-user need, it should focus on creating a simple, reliable and publicly accessible infrastructure that “exposes” the underlying data. Private actors, either nonprofit or commercial, are better suited to deliver government information to citizens and can constantly create and reshape the tools individuals use to find and leverage public data. The best way to ensure that the government allows private parties to compete on equal terms in the provision of government data is to require that federal websites themselves use the same open systems for accessing the underlying data as they make available to the public at large.

The paper cites our own Jerry Brito, who has done some great work in the same vein, in several places.

WASHINGTON, June 2 – Ensuring that all Americans have access to broadband is about more than ensuring high-speed Internet connectivity, said the CEO of One Economy, a non-profit organization promoting a philosophy of “digital inclusion.”

In addition to ensuring that broadband is present, affordable and available for adoption by low-income Americans, groups aiming to make a difference in stemming the digital divide must also focus on human capital and digital media content, said Rey Ramsey of One Economy, speaking last week at a plenary session of the International Summit for Community Wireless Networks here.
