Geoffrey Stone has a great rebuttal to John Ashcroft’s op-ed in the New York Times on wiretapping:

Suppose the government asked a private security firm to commit murder or torture or rape. Would they, too, be entitled to immunity because they acted on the basis of “explicit assurances from the highest levels of the government that the activities in question were authorized by the president and determined to be lawful”? Is there a difference in principle between these situations? Perhaps in Mr. Ashcroft’s view unlawful surveillance is different because it’s just not a sufficiently serious violation of individual freedom to expect private individuals and organizations to question the legality of the government’s request. Perhaps Mr. Ashcroft would demand legislative immunity even in cases of murder, torture, and rape. I would like to know.

Second, what makes Mr. Ashcroft think that the government or the telecommunications companies could reasonably have believed in this situation that the government’s surveillance program was lawful? As a matter of fact, the clear consensus among legal and constitutional experts is that Mr. Bush’s surveillance program violated the 1978 Foreign Intelligence Surveillance Act, which expressly prohibited such conduct. Only a tiny slice of the legal profession believes that the Bush surveillance program was lawful, and almost all of them had been recruited into the Bush White House.

It was hard to pick one excerpt because it was all really good, so go read the whole thing.

In this very entertaining piece, our frequent intellectual sparring partner Tim Wu admits that certain New York City bureaucrats may be driving him to libertarianism.

I really wish Tim would become a true libertarian. As that essay and his brilliant 5-part series of essays on “American Lawbreaking” for Slate illustrate, he is an incredibly gifted writer and a first-rate thinker. And, at times, his thinking does lean in the libertarian direction, but not enough to grant him credentials to the club just yet! (Tim and I also share a nerdy affection for Dungeons & Dragons, so I have to admit to liking him for that reason. I was far too familiar with 20-sided dice as a youngster. Sad, but true.)

I’ve written plenty here before about the potential pitfalls associated with a la carte regulation of cable and satellite television. What troubles me most about a la carte regulatory proposals is that proponents make grandiose claims about how it would offer consumers greater “choice” and lower prices without thinking about the long-term consequences of regulation. As I pointed out in a recent editorial in the Los Angeles Daily Journal, the problem with these regulatory activists is that “Their static view of things takes the 500-channel universe for granted; they assume it will always be with us and that it’s just a question of dividing up the pie in different (and cheaper) ways.” But as I go on to explain, a la carte regulation could bring all that to an end:

To understand why [it will harm consumers], we need to consider how it is that we have gained access to a 500-channel universe of diverse viewing options on cable and satellite. All of these channels didn’t just fall like manna from heaven. Companies and investors took risks developing unique networks to suit diverse interests. Thirty years ago, few could have imagined a world of 24-hour channels devoted to cooking, home renovation, travel, weather, religion, women’s issues, and golf. Yet, today we have The Food Channel, Home & Garden TV, The Travel Channel, The Weather Channel, EWTN, Oxygen, The Golf Channel, and countless other niche networks devoted to almost every conceivable human interest. How did this happen?

The answer is “bundling.” Many niche-oriented cable networks only exist because they are bundled with stronger networks. On their own, the smaller channels can’t survive; nor would anyone have risked launching them in the first place. “Bundling” is a means for firms to cover the enormous fixed costs associated with developing TV programming while also satisfying the wide diversity of audience tastes. Bundling channels together allows the niche, specialty networks to remain viable alongside popular networks such as CNN, ESPN and TBS. Bundles, therefore, are not anticonsumer but proconsumer.

Continue reading →

The Google Public Policy blog likes S. 2321, a bill to amend the E-Government Act of 2002.

According to the Googlers, “it directs the Office of Management and Budget to create guidance and best practices for federal agencies to make their websites more accessible to search engine crawlers, and thus to citizens who rely on search engines to access information provided by their government.”

Who says everything Google says and does is interesting?

But seriously, more government transparency is better. And my effort at government transparency and public involvement shows opinion on S. 2321 running at . . . well, take a look for yourself!
Get out the vote, Google!

Update: Jerry and I seem to have written about this at about the same time. Look to him for more substance. Me, I’m just links, quotes, snark, and widgets.

Last week, Joe Lieberman and others introduced a bill in the Senate to reauthorize the E-Government Act of 2002. In my new paper about online government transparency I explain how most agencies are likely in compliance with the Act simply by putting their regulatory dockets online, even though those dockets may be largely inaccessible to the public. For example, the FCC’s online docketing system, about which I’ve been griping lately, is probably up to par as far as the Act goes.

The good news is that the reauthorization bill includes an amendment that aims to make federal websites more accessible. It reads in part:

Not later than 1 year after the date of enactment of the E-Government Reauthorization Act of 2007, the Director [of OMB] shall promulgate guidance and best practices to ensure that publicly available online Federal Government information and services are made more accessible to external search capabilities, including commercial and governmental search capabilities. The guidance and best practices shall include guidelines for each agency to test the accessibility of the websites of that agency to external search capabilities. … Effective on and after 2 years after the date of enactment of the E-Government Reauthorization Act of 2007, each agency shall ensure compliance with any guidance promulgated[.]

The purpose of these changes is to make federal sites more easily indexed by commercial search engines, such as Google, which are what most citizens use to find information. Some agencies have begun looking into this already. That is great in itself, but what really interests me here is the notion of “best practices” guidelines with which the agencies must comply. This could be the Trojan Horse that gets XML into federal sites. Once data is available in a structured format, third parties can use it to create different (and likely better) user interfaces for the data, as well as interesting mashups.
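
To make that concrete, here is a minimal sketch of the kind of reuse structured data permits. The XML format below is entirely hypothetical (no agency publishes such a feed today, which is the point), but given anything like it, a third party needs only a few lines to build an alternative view of a docket:

```python
import xml.etree.ElementTree as ET

# A hypothetical structured docket feed. No agency publishes anything like
# this today -- which is exactly the gap the "best practices" guidance
# could close. The docket number and comments are made up.
FEED = """\
<docket id="EX-07-123">
  <title>Broadband Industry Practices</title>
  <comment date="2007-11-01" author="A Public Interest Group">Supports the inquiry.</comment>
  <comment date="2007-11-05" author="A Large Carrier">Opposes new rules.</comment>
  <comment date="2007-10-29" author="A Concerned Citizen">Asks for public hearings.</comment>
</docket>
"""

root = ET.fromstring(FEED)
print(root.findtext("title"))

# A trivial "better interface": comments sorted by date -- something a
# third-party mashup could offer even if the official site does not.
for c in sorted(root.iter("comment"), key=lambda c: c.get("date")):
    print(c.get("date"), c.get("author") + ":", c.text)
```

Scraping the same information out of a frames-based HTML site is possible but brittle; a published structured format is what turns one-off scraping into an ecosystem of mashups.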

I hope OMB will take this opportunity to revamp its e-gov efforts. Regulations.gov, a site it manages along with EPA, does not offer XML. (I’ve talked about this before here.) It also does abysmally on search engines, perhaps because it uses outdated frames markup. A quick check shows Google last indexed the site in January. I sincerely hope this kick-starts things.
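
The frames diagnosis is easy to check for yourself: a frameset page is an empty shell whose visible content lives in sub-documents a crawler has to chase separately, and crawlers of this era often don’t. Here is a rough sketch of such a check; the `FrameFinder` class, the `check` helper, and the commented-out URL are my own illustrations, not a claim about Regulations.gov’s exact markup:

```python
from html.parser import HTMLParser
from urllib.request import urlopen

class FrameFinder(HTMLParser):
    """Collects <frameset>/<frame>/<iframe> tags, whose contents many
    search crawlers will not index along with the parent page."""
    def __init__(self):
        super().__init__()
        self.found = []

    def handle_starttag(self, tag, attrs):
        if tag in ("frameset", "frame", "iframe"):
            self.found.append((tag, dict(attrs).get("src")))

def check(url):
    # Fetch the page and scan it for frame tags.
    html = urlopen(url).read().decode("utf-8", errors="replace")
    finder = FrameFinder()
    finder.feed(html)
    if finder.found:
        print(url, "serves frames:", finder.found)
    else:
        print(url, "has no frames; its content is at least visible to crawlers")

# check("http://www.regulations.gov/")  # illustrative; the site's markup may change
```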

Crashing Techdirt

November 13, 2007

I’m sure plenty of TLFers already read Techdirt, but in case you needed yet another reason to add it to your feed reader, two of the smartest bloggers I know—Julian and Tom—have begun contributing to the site.

Meanwhile, my contributions to TLF have been a little slower than usual as a lot of my blogging energies have been diverted over there. One of the interesting things about contributing to Techdirt has been the opportunity to branch out a little bit into the kind of pure tech/business analysis that wouldn’t really be on-topic for TLF. My latest post is a spin-off of our recent discussion of Bill Rosenblatt’s article about Radiohead and the “race to the bottom”:

The strangest thing about Rosenblatt’s article is the pejorative use of the term “race to the bottom” to describe competition in the music industry. If Apple cut the price of the iPod, we would be really surprised to see a columnist complaining that Apple had started a “race to the bottom” that will undermine profits among consumer electronics companies. We understand that, as painful as competition can be for producers, consumers and the economy as a whole benefit from such aggressive price-cutting. Talking about a “race to the bottom” is the language of cartels, which try to hold prices above the competitive level. Music is like any other product. As the marginal costs of production and distribution fall, it’s natural that the price of music will fall as well. Smart musicians and companies will find ways to adapt and prosper in the new, more competitive marketplace. As we’ve said before, saying you can’t compete with free is saying you can’t compete at all. The sooner musicians and record labels realize that, the more prepared they’ll be when the price of music drops out from under them.

Politician Spammers

November 13, 2007

Here’s an email I got today on behalf of Steve Kelley, a former Minnesota state senator who is now Director of the Center for Science, Technology & Public Policy at the University of Minnesota.

A Message From Steve Kelley

Dear Friends,

I have appreciated your becoming part of my Internet community by signing up to get emails during the 2006 campaigns . . .

Needless to say, I never signed up for any Steve Kelley email list.

As a legislator, Kelley was active in a variety of efforts to regulate the Internet and e-commerce, and he passed legislation regulating ISPs’ information practices and attempting to prevent spam by outlawing deceptive subject lines.

Do ya’ think Minnesota is anything close to a spam-free state? Kelley’s work is little more than surplusage in the Minnesota code.

But given Kelley’s expertise, perhaps the Center for Science, Technology & Public Policy should start a project to outlaw political and academic spamming.

Google Policy Fellowship

November 13, 2007

Google has announced the Google Policy Fellowship – “to support students and organizations working on policy issues fundamental to the future of the Internet and its users.”

Fellows will have the opportunity to work at public interest organizations at the forefront of debates on broadband and access policy, content regulation, copyright and trademark reform, consumer privacy, open government, and more. Participating organizations are based in either Washington, DC or San Francisco, CA, and include: American Library Association, Cato Institute, Center for Democracy and Technology, Competitive Enterprise Institute, Electronic Frontier Foundation, Internet Education Foundation, Media Access Project, New America Foundation, and Public Knowledge.

Continue reading →

Good. Musician makes good. There’s an interesting article with some ideas in Spin magazine–though no clear direction emerges. Potentially useful for new artists, not so much for encouraging the re-release of Led Zeppelin (soon to be on iTunes) or old blues. The thought of music entangled in a web of marketing schemes is not entirely appealing, but, well, that’s not a policy concern. What becomes of artists from unsophisticated backgrounds in all this might well be… professional sports all over again.

On the prospects for live music, from Richard Morrison. (And I confess another non-policy consideration: I detest live music–one sacrifices consistent sound quality, leaves the privacy of one’s home, and sits or stands in crowds flaunting their absurd subcultures–but I will make grudging exceptions for metal concerts, classical guitar, and live jazz.) But this, too, has its limits as a business model.

Also less encouraging is Radiohead’s experiment in whatever-it’s-worth pricing, with many electing a price of zero; the link is to Bill Rosenblatt’s report. Barry Shrum offers his perspective.

In the end, it will all get worked out. But there is no end in sight for the usefulness of copyright and technology as tools for defining obligations in new relationships of goods, services, and persons, or as substitutes for traditional enforcement. Continued competition between free goods and paid goods would reduce anxiety about whether producers are sensitive to consumer demand for flexible and friendly protection technology.

Two distressing trends in the overall debate, though, might well be with us forever. One is the tendency of some to see the glass of new technology as almost entirely empty, the other to see it as almost entirely full. But where old boundaries don’t hold up, new lines will be found and somehow enforced; markets go on. And where the status quo gives way, one ends up not with an end to the limitations on human endeavor peculiar to one set of economic circumstances, but with a whole new set of limitations peculiar to the next. On the whole, people don’t do well without lines drawn in the sand, and will draw new ones when the last set is erased.

So What is Privacy Anyway?

November 11, 2007

An Arsticle by Ken Fisher reviews a recent talk given by Donald Kerr, principal deputy director of National Intelligence, who is second in command to Director of National Intelligence Mike McConnell.

In a recent speech, Kerr fumbled around with privacy and related concepts; in Ken’s (and an AP reporter’s) opinion, he is trying to redefine privacy in somewhat Orwellian ways.

Here’s the meat of what Kerr said:

Continue reading →