Oh dear. LOLcat translates “The Waste Land.”

Bill Rosenblatt reports on EFF’s and others’ support for fair use in filtering. My take: a constructive step. It remains to be seen whether an objective “fair use” standard can be developed and, if so, whether it can be implemented technologically; if not, the answer will be some combination of an “appeals” process and simple licensing mechanisms.

In particular, this is a welcome departure from the “filtering is useless” stance. Certainly, filtering can be defeated. But ultimately, anything posted for public consumption must be in the clear, and not everyone will encrypt, especially those unaware that they are infringing. By and large, it ought to be possible to get copyright filtering for entire works to work at least as well as spam filtering: not perfectly, but well enough to get a handle on the problem.
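To make the spam-filter analogy concrete, here is a minimal sketch, assuming a hypothetical registry of whole-work fingerprints supplied by rights holders; the registry contents, digests, and review step are all illustrative, not any vendor’s actual system:

```python
# A minimal sketch (not any real filtering product) of "entire work"
# copyright filtering, by analogy with hash-based spam filters.

import hashlib

# Hypothetical registry: SHA-256 digests of works submitted by rights holders.
REGISTERED_WORKS = {
    "3a1f...": "Example Song (Example Label)",   # placeholder digest, for illustration only
}

def fingerprint(data: bytes) -> str:
    """Return a simple whole-file fingerprint."""
    return hashlib.sha256(data).hexdigest()

def check_upload(data: bytes) -> str:
    """Flag an upload that exactly matches a registered work.

    Like a spam filter, this will miss altered copies and needs a human
    "appeals" process for false positives and fair uses.
    """
    digest = fingerprint(data)
    match = REGISTERED_WORKS.get(digest)
    if match is None:
        return "allow"
    return f"flag for review: matches '{match}'"
```

Exact hashing only catches verbatim copies, which is precisely the “not perfectly, but well enough” trade-off: it handles the unencrypted, publicly posted case without pretending to catch determined evaders.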

In today’s New York Times, John Ashcroft jumps on the bandwagon for giving telcos blanket immunity for their participation in illegal wiretapping programs:

At the outset, it is critical to understand what the immunity provisions the administration and Congress have negotiated actually do. This is not “blanket immunity,” as it is sometimes caricatured by its opponents. The Senate bill would confer immunity in only two limited circumstances: if the carrier did not do what the plaintiffs claim; or if the carrier did do what the plaintiffs claim but based on explicit assurances from the highest levels of the government that the activities in question were authorized by the president and determined to be lawful.

Longstanding principles of law hold that an American corporation is entitled to rely on assurances of legality from officials responsible for government activities. The public officials in question might be right or wrong about the advisability or legality of what they are doing, but it is their responsibility, not the company’s, to deal with the consequences if they are wrong.

To deny immunity under these circumstances would be extraordinarily unfair to any cooperating carriers. By what principle of justice should anyone face potentially ruinous liability for cooperating with intelligence activities that are authorized by the president and whose legality has been reviewed and approved by our most senior legal officials?

A couple of points immediately spring to mind here. In the first place, if “longstanding principles of law” tell us that the telcos are “entitled to rely on assurances of legality from officials responsible for government activities,” then why is new legislation necessary? Why can’t AT&T simply invoke those principles in court and get the lawsuits dismissed without Congress having to get involved?

Second, the claim that this is not “blanket immunity” is absurd. Obviously, AT&T and Verizon aren’t going to hand over customer data the executive branch hasn’t asked for. And the executive branch would never admit that its information requests were unlawful. So granting immunity for any requests the executive branch says are lawful means granting immunity for any conceivable information request. That’s blanket immunity; there’s nothing “limited” about it.

Third, the “principle of justice” Ashcroft is looking for here is the warrant requirement of the Fourth Amendment. The fundamental principle of the Fourth Amendment is that the judicial branch, not the executive branch, gets to decide when a search is “authorized.” No matter how many executive branch officials “review and approve” a search, the search isn’t constitutional unless it’s approved by a judge.

But actually, if I were in Congress I would be willing to call Ashcroft’s bluff. I’d support immunity legislation on the condition that the president appoint a special prosecutor who would conduct a top-to-bottom review of all the wiretapping programs the White House has undertaken and bring criminal charges against the relevant administration officials (including, ahem, Ashcroft himself) if the prosecutor finds that any of them ran afoul of the law. Of course, the White House would never consent to that, because they don’t really believe that executive branch officials should “deal with the consequences” of the decisions they make. To the contrary, I suspect one reason the White House is pushing so hard for immunity is that it would be embarrassing if a court found that participation in its programs was illegal. They don’t believe anyone should suffer consequences for breaking the law.

The New York Times reports on Attributor, a company tackling the broad re-use of copyrighted material online:

The company has developed software that identifies an electronic “fingerprint” for a particular piece of material — an article, a picture, a video. Then it hunts down any place across the Web where a significant chunk of that work has been copied, with or without permission.

When the use is unauthorized, Attributor’s software can automatically send a message to the site’s operators, demanding a link back to the original publisher’s site, a share of revenue from any ads on the page, or a halt to the copying.

No word on whether the software also determines whether the unauthorized uses it finds are nevertheless fair uses. That aside, this sort of searching technology should help placate the fears of content owners over the sort of orphan works legislation I’ve proposed.
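Attributor hasn’t published how its fingerprinting works, so the following is only a generic illustration of how a “significant chunk” of copied text can be detected using word shingles; the window size and any threshold you’d apply are arbitrary assumptions of mine, not the company’s method:

```python
# Generic word-shingle comparison: estimate how much of an original
# article reappears in a candidate web page.

def shingles(text: str, k: int = 8) -> set:
    """Return the set of k-word shingles (overlapping word windows)."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def copied_fraction(original: str, candidate: str, k: int = 8) -> float:
    """Fraction of the original's shingles that also appear in the candidate."""
    orig = shingles(original, k)
    cand = shingles(candidate, k)
    if not orig:
        return 0.0
    return len(orig & cand) / len(orig)

# A page that reproduces long passages verbatim scores near 1.0, while a
# short quotation scores low -- which is exactly why a score alone cannot
# tell you whether a given reuse is fair use.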

Matt is clearly right that geek activists are lousy at political organizing, and that Internet utopianism may lull some of us into a false sense of security. But I think that, if anything, Benkler’s writing demonstrates the opposite tendency: his pronouncements tend toward the apocalyptic. For example, he says:

I think there are certain well-defined threats to this model. If we end up with a proprietary communications platform, such as the one that the FCC’s spectrum and broadband policies are aiming to achieve; and on that platform we will have proprietary, closed platforms like the iPhone, then much of the promise of the networked environment will be lost.

Now, I’ve written before that I think Benkler overhypes the potential of a spectrum commons. I won’t belabor that point, but I think his comments about the iPhone are particularly interesting. It’s certainly true that advocates for open standards like Benkler (and me) have much to criticize in the iPhone. But it’s a mistake to view the iPhone as a step backwards for open networks without looking at the broader context.

In the first place, Apple’s attempts to lock down the iPhone have sparked an enormous customer backlash, and that backlash may have spurred Apple to release an SDK for the phone. I would bet money that within five years the iPhone will be a de facto open platform with a thriving community of third-party developers.


An ACLU release issued yesterday reports that the Department of Homeland Security is telling state leaders that it will not enforce the REAL ID law.

“In discussions I participated in with the Department of Homeland Security, they were asked point blank, ‘What will happen to states that don’t participate?’” said Maine Secretary of State Matthew Dunlap, who was on the phone call with [DHS Assistant Secretary Richard] Barth. “The response was, ‘Nothing will happen. There will be no penalty. You can still get on a plane.’”

It’s hard to make out why the DHS is saying this and what it means. Most likely, Barth and the DHS are trying to shrink REAL ID down so far that they can convince a substantial number of states to announce compliance, allowing the department to claim a “successful program.” Later regulations could then grow it into the national ID it’s meant to be.

The fact that the REAL ID Act has no teeth, of course, means that states can refuse to comply entirely. There’s not even the threat, long known to be impotent, that their residents wouldn’t be able to get on planes.

Whatever the case, the program is in shambles. It would be cool if Congress were to go ahead and admit it, but nothing needs to happen for the last nail to go into REAL ID’s coffin.

Our old friend Declan McCullagh, the dean of high-tech policy journalists, has just posted an excellent column outlining his concerns with the “Do Not Track List” notion that Harper and I blasted yesterday. As usual, Declan explains better than any of us why this is such a silly and dangerous regulatory proposal:

Nobody’s holding a gun to Internet users’ heads and forcing them to visit Amazon or Yahoo. They do it because they trust those companies to take reasonable steps to protect their privacy. To insist that the feds must step in because a few vocal lobbyists and activists don’t like those steps should be insulting to Americans: it suggests that they’re too simpleminded to make their own decisions about what’s best for them and their families. (It’s similar in principle to price regulation, when special-interest lobbyists insist that prices are too high or too low and must be altered by legislative fiat.)

What makes this an even sillier debate is that there already are a wealth of ways to accomplish “Do Not Track” without the feds. This is the third principle of Internet regulation: If technology exists to solve a perceived problem, it’s probably better to encourage its use rather than ask federal agencies for more regulations or demand that the techno half-wits in Congress draft a new law.

Amen, brother.


Tomorrow, at Vanderbilt Law School, I’ll join a panel discussion on The Future of Copyright, part of the Journal of Entertainment and Technology Law’s symposium, User-Generated Confusion: The Legal and Business Implications of Web 2.0. My presentation: User-Generated Content, Copyright Policy, and Blockheaded Authors. Rest assured that, though I deploy such phrases as “seizing the means of reproduction” and “the specter of copyism,” that says more about my love of wordplay than about any Marxism. You can download the PowerPoint file here.

[Crossposted to Intellectual Privilege and Agoraphilia.]

Earlier today, Jim Harper raised some valid concerns about the new “Do Not Track List” that some groups are proposing be mandated by the FTC. I’d like to point out another concern with this concept. A mandatory “Do Not Track” registry creates a potentially dangerous precedent, a framework for a nationwide mandatory registry of the URLs of websites that some policymakers might deem objectionable in ways that go beyond tracking. When I first read these two provisions on page 4 of the Do Not Track proposal, I could not help but think of how a savvy Net-censor might use them in an attempt to regulate Internet content in other ways:

“Any advertising entity that sets a persistent identifier on a user device should be required to provide to the FTC the domain names of the servers or other devices used to place the identifier.”

…and…

“Companies providing web, video, and other forms of browser applications should provide functionality (i.e., a browser feature, plugin, or extension) that allows users to import or otherwise use the Do Not Track List of domain names, keep the list up-to-date, and block domains on the list from tracking their Internet activity.”

I can easily imagine would-be Net censors using that language as a blueprint to regulate other types of online speech. For example, it could be rewritten as follows [with my additions in brackets]:

“Companies providing web, video, and other forms of browser applications should provide functionality (i.e., a browser feature, plugin, or extension) that allows users to import or otherwise use the [government-approved] list of domain names, keep the list up-to-date, and block domains on the list [that are harmful to minors].”

Perhaps I’m just being paranoid, but would-be Net censors have struck out on other regulatory fronts over the past 10 years, and they are looking for a new framework. A mandatory Do Not Track List might give them an opening.
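For what it’s worth, the blocking functionality both versions of that provision contemplate is technically trivial, which is part of what makes the precedent worrying. Here is a rough sketch, assuming nothing more than a plain-text list of domains; the file format and function names are my own illustration, not anything from the proposal:

```python
# Rough sketch of the domain-list check a browser feature, plugin, or
# extension would run against each request.

from urllib.parse import urlparse

def load_list(path: str) -> set:
    """Load one domain per line, ignoring blank lines and comments."""
    with open(path) as f:
        return {line.strip().lower() for line in f
                if line.strip() and not line.startswith("#")}

def is_listed(url: str, domains: set) -> bool:
    """True if the URL's host is on the list or is a subdomain of a listed domain."""
    host = (urlparse(url).hostname or "").lower()
    return any(host == d or host.endswith("." + d) for d in domains)

# For a Do Not Track list, the browser would refuse to send identifiers to
# listed hosts. Swap in a different list and block the request outright,
# and the very same check enforces a censorship blacklist.
```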

When Congress delegates its authority to make laws to unelected regulators, a certain amount of accountability is lost. To make up for this, the Administrative Procedure Act requires regulators to act openly and transparently. They must make publicly available the rules they are considering, must take comments from the public, and must consider those comments in adopting final rules. As I explain in my new paper, making something publicly available in the twenty-first century means putting it online. But merely putting documents online is not enough to be truly transparent. The public has to be able to find and access the documents easily and, ideally, to use them in the sort of innovative ways the state of the art allows.

In this installment of my series looking at the FCC’s website, we’ll take a look at the Commission’s online docket system. So what’s wrong with it?
