Ryan Singel at the always-excellent Threat Level blog debunks the latest lies about wiretapping laws getting Americans killed in Iraq. The story claims that it took soldiers in Iraq 10 hours to get the necessary legal permission to wiretap the cell phones of terrorists who had kidnapped American soldiers.
As Singel points out, there are a bunch of problems with this story. First, the military doesn’t need any court approval to do wiretaps physically outside of the United States, so if they had taps on cell phone towers in Iraq (and as Singel points out, if we don’t have wiretaps on the Iraqi cell phone network, “we all deserve tax refunds”), no approval would have been necessary. Second, the issue never reached the FISA court, as the executive branch determined it had the authority to conduct the search without a new court order. Third, most of the delay came not from the NSA doing the paperwork required to determine whether it needed a warrant, but from delays at the DOJ, which sat on the NSA’s request for seven hours. A timeline from Rep. Reyes tells the story:
At 10:00 a.m., key U.S. agencies met to discuss and develop various options for collecting additional intelligence relating to the kidnapping by accessing certain communications
At 10:52 a.m., the NSA notified the Department of Justice (DOJ) of its desire to collect some communications that require a FISA order. It was determined that some FISA coverage already existed.
At 12:53 p.m., the NSA General Counsel agreed that all of the requirements for an emergency FISA authorization had been met for the remaining collection of the communications inside the U.S.
Collection could have started immediately – the requirements of the statute were satisfied. As James Baker, head of the FISA office, has testified to Congress, emergency authorization can take place in minutes and can be granted orally.
However, the NSA played it safe and waited for the Justice Department to give the go ahead. How long could that take?
Continue reading →
Andrew Keen is the web’s favorite whipping boy these days, and in some ways he has it coming. His latest book, The Cult of the Amateur: How Today’s Internet is Killing Our Culture, is an anti-all-things-Web 2.0 screed. Keen lambastes “Internet democracy” (specifically the Wiki model of collaborative creation) and decries the rising tide of user-generated everything. When you get right down to it, Keen’s view of the world is unapologetically techno-conservative and culturally elitist. He’s angry that there are fewer intermediaries minding the culture. As a result, he argues, “professional” media (by which he means to say “better” media) is giving way to “amateur” media (which he regards as synonymous with, well… crap).
Unsurprisingly, the blogosphere has fought back with a vengeance and called Keen every nasty name in the book. But the best and most level-headed critique of Keen’s work is still this old essay by the ever-insightful Clay Shirky. Clay’s response rightly concedes that Keen is correct in pointing out that some important things have been lost with the rise of the Internet. There certainly are fewer intermediaries filtering our culture for us, and that will sound like a great thing to many of us. But it’s important to realize that some of those mediating forces serve a valuable role. Editors, for example, play an important, but often overlooked, role in improving the quality of a great deal of media content of all varieties (journalism, books, movies, music, etc.). The blogosphere is becoming an editor-free zone, and at times it really shows. There are times when some particularly insulting things are said or silly mistakes are made that probably would have been corrected had a good editor been responsible for overseeing the final product.
On the other hand, the unfiltered Web 2.0 experience is wonderfully refreshing. Sometimes it’s nice to see what the uninhibited exchange of ideas results in. Regardless, the bottom line is that the editing profession (broadly defined) is changing because of the Internet. That is undeniable. And other mediating forces or institutions are seeing their power or relative importance in the cultural creation process diminished as the Internet-spawned disintermediation continues unabated.
Will that create short-term problems? Undeniably. But Keen thinks these developments are contributing to a sort of cultural catastrophe and that we are collectively much worse off because of this disintermediation and empowerment of the “amateur.” This goes much too far in my opinion.
Continue reading →
It’s long been conventional wisdom that a Hillary Clinton presidential administration would quickly move to adopt net neutrality regulation. Now that conventional wisdom has been cast into doubt. Although Sen. Clinton has supported net neutrality legislation in Congress, the idea was noticeably absent from the “innovation agenda” she announced in late August. The absence has caused some — perhaps belated — consternation on the net neutrality Left — with a post last week by Matt Stoller on Open Left asking “Where’s Your Net Neutrality Proposal Senator Clinton?” Stoller warns: “If anyone has illusions about how horrific Clinton will be as a President, disabuse yourself now.”
Strong words. Maybe it’s just election-season hyperbole. And maybe a Clinton neutrality proposal is still in the cards. (Clinton did, after all, label the innovation agenda “version 1.0.”)
Still, one can’t help but sense a bit of panic on the left as the neutrality army continues to fray.
One more thing to which you should stay tuned.
(Thanks to Scott Cleland for the heads up.)
I was very interested to read Solveig’s recent discussion of copyright issues and the justice of file sharing. It seems to me that her line of argument runs contrary to a core insight of libertarian theory, best articulated by Robert Nozick, that a just outcome is one that emerges from a series of just transactions. Nozick endorsed what he called historical theories of justice, contrasting them with patterned theories such as Rawls’s Difference Principle. Libertarians have always been wary of starting with a desired social result (i.e. “everyone should have affordable health care”) and then reasoning backwards to derive a set of legal rules we think will achieve that outcome (i.e. “Every employer shall provide health insurance to his employees,” “no hospital shall turn away an emergency room patient due to inability to pay”). That’s partly because we have an instinctive aversion to telling other people how they should live their lives, but just as importantly it’s because we’re aware that these sorts of cause-and-effect predictions are extraordinarily difficult to make. Libertarians are constantly explaining the various clever and non-coercive mechanisms people develop to solve collective action problems that economic theory says can only be solved by government action.
For example, in the Abigail Alliance case, libertarians’ sympathies were with the plaintiffs, who assert that terminally ill patients have an inalienable right to experiment with unapproved but potentially life-saving drugs. FDA bureaucrats countered that, in essence, they needed the power to condemn certain people to death to ensure the integrity of their clinical testing program. Now, despite the prejudicial way I just described it, the FDA’s argument isn’t completely crazy. It really is easier to design statistically rigorous clinical trials if they can be assured that anyone they reject will not be able to get access to experimental drugs through other channels. And it’s at least possible that in the long run, ensuring the integrity of the current system of clinical trials will save lives on net.
Continue reading →
Needing a critter fix, I hauled The Grub, now three, off to the National Wildlife Visitor’s Center this weekend for their festival. Actual wildlife was promised, and they had splendid owls and turtles, looking rather sleepy. On the whole, though, I find the direction that such organizations have taken in their public presentations to be both uninteresting and depressing. I learned little about the behavior, habits, and lives of the critters being studied, and a great deal about their habitats and the destruction thereof.
There are good reasons for the focus on systems. Although the naturalists would not put it this way: The environment is a commons of sorts; as such, it is likely to be degraded, with no one properly internalizing the costs–it needs a fix at the systemic level. But I already knew this; I wanted to learn more about the critters. I like critters. Too much. I am an old school amateur naturalist of the sort that made such a disaster of federal forest management–putting out forest fires when they ought to be allowed to burn off the brush and bugs, because I am not willing to see a raccoon’s toes be singed. But the result of decades of such a policy is a sick forest that ultimately burns so fiercely it cannot be controlled at all.
Copyright debates strike me as suffering from the opposite defect. We hear a great
Continue reading →
Last week was a whirlwind of activity for the telecommunications, media and technology project with which I had been engaged since August 2006.
The folks at the Berkman Center for Internet and Society at Harvard were kind enough to invite me to speak in their luncheon series on Tuesday, October 9. I discussed “Media Tracker, FCC Watch, and the Politics of Telecom, Media and Technology.” I’m happy to report that the event is now archived on Media Berkman as a webcast.
I spoke about the work of the “Well Connected” Project at the Center for Public Integrity, for which I was responsible. I devoted most of my time in the lecture to the Media Tracker, the interactive database at the heart of the project. The Media Tracker combines data from publicly available sources in a unique way, mapping out media and telecom ownership at the ZIP code level. Ownership is linked to lobbying expenditures and campaign contributions by company. The level of contribution by a telecom, media or technology company to any federal candidate can be viewed – documenting who has received what from whom.
Continue reading →
Earlier this month, a Minnesota jury found a Duluth-area single mother guilty of illicit file-sharing and ordered her to pay a six-figure fine. The evidence against the defendant seemed pretty airtight, but the fine struck me as unreasonably harsh—you’d never get a $222,000 fine for your first conviction of shoplifting physical CDs.
In this week’s podcast, we’re joined by two individuals who have been following this issue closely. Eric Bangeman is the managing editor of Ars Technica. He spent a week in Minnesota covering the trial, and he gives us a first-hand account of the proceedings. Debbie Rose is an IP fellow at the Association for Competitive Technology, and she gives us her perspective on the broader legal and ethical issues.
There are several ways to listen to the TLF Podcast. You can press play on the player below to listen right now, or download the MP3 file. You can also subscribe to the podcast by clicking on the button for your preferred service. And do us a favor, Digg this podcast!
Matt links to a Post op-ed that rightly criticizes the Bush administration for insisting on completely unfettered wiretapping powers, but otherwise misses the boat on the details of the dispute between the White House and the Democratic leadership. The argument has two major problems. First, we’ve got this:
The administration says that FISA wasn’t intended to cover the collection of intelligence information overseas. That is correct, but many of the communications are being intercepted in the United States and, more important, may involve U.S. citizens. In that situation, and with telephone and e-mail communications between the U.S. and foreign countries far more common than when FISA was enacted in 1978, it is reasonable to bring the court into the picture. The measure strikes an appropriate balance between the demands of some civil liberties groups for individualized warrants and the administration’s desire for sweeping authority.
The phrase “bring the court into the picture” makes it sound like court oversight for domestic-to-foreign communication is a new idea. But in fact it’s not—it’s the way FISA has worked since it was enacted. If you wanted to install a wiretap on American soil, you had to get a FISA warrant, regardless of whether the other end of the line was overseas or not. The question isn’t whether we should “bring the court into the picture.” The question is whether we should cut the courts out of the oversight role they’ve played successfully for the last 30 years.
Continue reading →
The New York Times has come up with a nifty online feature — a presidential debate analyzer that allows you to see word-by-word what the candidates at this week’s GOP presidential debate said about what, and how many times.
It’s worth a look. Those looking for statements by the contenders on Internet or telecommunications policy, however, are likely to be disappointed. A few quick searches reveal that — although this was billed as an economic policy debate — anything having to do with the digital world seems absent. “Telecommunications”? 0 mentions. “Net neutrality”? Not there. Television? Surprisingly absent. FCC? Mentioned only once, by the moderator, Chris Matthews.
The word “Internet” was mentioned six times, after Matthews asked Rudy Giuliani how he would police the Internet culturally. He stated firmly that he wouldn’t tax the Internet (reassuring, but not really responsive), and indicated broadly that existing laws should be sufficient to police child predators and the like. Matthews persisted, asking directly whether we need a new, FCC-style agency for the Internet. Oddly, Giuliani hedged a bit on that, hinting that maybe he would if things got “worse.” Kudos to John McCain, who seemed to practically chew through his microphone at that point to say his answer was “absolutely not.”
There’s also an analyzer for the September 26 Democratic debate. Internet mentions there? Zero.
The Times analyzer is a fascinating gizmo, even if the content isn’t encouraging.