Articles by Jim Harper

Jim Harper is the Director of Information Policy Studies at The Cato Institute, the Editor of Web-based privacy think-tank Privacilla.org, and the Webmaster of WashingtonWatch.com. Prior to becoming a policy analyst, Jim served as counsel to committees in both the House and Senate.


I’ve had quite enough of service providers of one kind or another making me part of their new social network. (I’m looking at you, Google Buzz.)

So here’s an article about how to control iTunes’ new Ping social network, which comes with iTunes 10.

Apple did a couple of things right: the “Terms and Conditions” click-through for the new version of the software makes it very clear that you’re getting Ping, and Ping appears to default to “off.” That’s what I found when I followed the directions in the article linked above, anyway.

If you want to be in yet another social network, you can enable Ping using those directions. You also might want to get your head checked. Something might be wrong in your real life.

If you’re in the D.C. area, come join the fun next Monday, November 15th, as the Advisory Committee on Transparency kicks off with its first event: The Future of Earmark Transparency (2:00 p.m., 2203 Rayburn House Office Building).

The Sunlight Foundation’s Daniel Schuman moderates a discussion that includes Steve Ellis of Taxpayers for Common Sense and yours truly. My WashingtonWatch.com project crowdsourced over 40,000 earmark requests last year, which we displayed on this map.

Earmarks are a hot topic right now. The Republicans in the new Congress may make a move to ban them, but the Senate leadership may not be ready to go quite that far.

Will full-fledged earmark transparency be the compromise? It might provide a model for far more transparent processes throughout Congress.

At BIGGOVERNMENT.com, Seton Motley takes the effort to regulate Internet service provision in the name of neutrality and stomps on it with both feet.

If this were high school (and politics really sort of is), Net Neutrality would be sitting alone at lunch — shunned even by the members of the marching band and the audio-visual club. Having had its lunch money taken, it would have only enough for milk (and would sadly be unable to open the container). It would be planning to take its aunt to prom.

His brief, unkind history takes the push for Internet regulation from its bright beginnings in 2006 through a four-year-long fade. It ends with the PR catastrophe the Progressive Change Campaign Committee produced when it signed 95 Democratic candidates onto a “Network Neutrality Pledge” and they all lost.

That fiasco doesn’t reveal anything about the merits of the proposal to turn Internet service providers into federally regulated public utilities. But it is emblematic of the immaturity and amateurishness of the push for net neutrality regulation. The effort never fixed on an actual, defined problem. Instead it rotated through corporate missteps with text message services, with web sites, and sometimes with actual Internet service. The movement was long on slogans and short on concrete proposals.

Proponents of net neutrality regulation never answered the conundrum posed by “regulatory capture”: the FCC they wanted to have police ISPs might end up controlled by those very ISPs. They never countered the point that technologists and marketplace actors would check the behavior of ISPs, a point made ably by Tim Lee in his paper, The Durable Internet.

Motley caps off his cyberbullying of the Internet regulation effort with an Examiner piece today noting that the Progressive Change Campaign Committee raised a pitiful $300 for its efforts.

[W]ith the PCCC’s feeble efforts and Tuesday’s historic pro-freedom Congressional demographic shift – the free market, free speech assault that is Net Neutrality now lies broken on the ash heap of Internet and tech history. To which we say – good riddance to bad rubbish.

If the push for net neutrality regulation survives, it will have to regroup and grow up, identify a concrete problem and a defensible solution, and then carry that credible message beyond its own echo chamber. All in all, the movement for net neutrality regulation seems to have been “playing at” advocacy rather than seriously advocating.

The recent European Commission proposal to create a radical and likely near-impossible-to-implement “right to be forgotten” provides an opportunity to do some thinking about how privacy norms should be established.

In 1961, Italian liberal philosopher and lawyer Bruno Leoni published Freedom and the Law, an excellent, if dense, rumination on law and legislation, which, as he emphasized, are quite different things.

Legislation appears today to be a quick, rational, and far-reaching remedy against every kind of evil or inconvenience, as compared with, say, judicial decisions, the settlement of disputes by private arbiters, conventions, customs, and similar kinds of spontaneous adjustments on the part of individuals. A fact that almost always goes unnoticed is that a remedy by way of legislation may be too quick to be efficacious, too unpredictably far-reaching to be wholly beneficial, and too directly connected with the contingent views and interests of a handful of people (the legislators), whoever they may be, to be, in fact, a remedy for all concerned. Even when all this is noticed, the criticism is usually directed against particular statutes rather than against legislation as such, and a new remedy is always looked for in “better” statutes instead of in something altogether different from legislation. (page 7, 1991 Liberty Fund edition)

The new Commission proposal is an example. Apparently the EU’s 1995 Data Protection Directive didn’t do it.

It is in vernacular practice, Leoni emphasizes, rather than in the dictates of some central authority, that we should discover the appropriate “common” law.

“[A] legal system centered on legislation resembles . . . a centralized economy in which all the relevant decisions are made by a handful of directors, whose knowledge of the whole situation is fatally limited and whose respect, if any, for the people’s wishes is subject to that limitation. No solemn titles, no pompous ceremonies, no enthusiasm on the part of the applauding masses can conceal the crude fact that both the legislators and the directors of a centralized economy are only particular individuals like you and me, ignorant of 99 percent of what is going on around them as far as the real transactions, agreements, attitudes, feelings, and convictions of people are concerned.” (pages 22-23, emphasis removed)

The proposed “right to be forgotten” is a soaring flight of fancy, produced by detached intellects who lack the knowledge to devise appropriate privacy norms. If it were to move forward as is, it would cripple Europe’s information economy while hamstringing international data flows. More importantly, it would deny European consumers the benefits of a modernizing economy by giving them more privacy than they probably want.

I say “probably” because I don’t know what European consumers want. I only know how to learn what they want—and that is not by observing the dictates of the people who occupy Europe’s many government bureaucracies.

Thoughts on the Election

November 3, 2010

Tech issues don’t move the needle in national elections like yesterday’s, but below I’ll make some general observations, followed by a few on winners and losers in issue areas I cover.

All in all, I think it’s a good election result.

We’re back to divided government. The acute tension between the Republican House and Democratic Senate and president is likely to produce fiscal rectitude, and only legislation on which there is something close to true national consensus will pass.

Neither the Republicans nor the Tea Party movement was awarded any kind of sweeping victory, so they are unlikely to overplay their hands or take public support for granted. They must work to advance their aims by persuading more Americans that their philosophies and leadership are meritorious.

Democrats should, of course, be chastened. They’re rightly paying the price for the careless, go-for-broke strategy they used in the 111th Congress to pass, for example, their sprawling, intrusive health care regulation.

Here’s to at least two years of welcome gridlock.

Now, there were some notable losses among tech-focused members of Congress. The most worrisome is that of Senator Russ Feingold (D-WI), who has been a consistent and persistent overseer and skeptic of the growing surveillance state. I don’t see anyone ready to step up and take his place. Privacy lost big in the Wisconsin election.

I’m bucking consensus on the loss of Rick Boucher (D-VA) in the House, at least as far as privacy goes. (On copyright and some telecom issues, I’ll take Mike Masnick’s word.) Boucher is a nice guy and a careful legislator, but his popularity among the Washington, D.C. tech lobby, I think, was a product of lobby-legislator symbiosis, not his actual backing for the interests of tech innovators.

For at least a decade, Boucher has been an advocate of “baseline privacy legislation” that never actually had a serious chance of passing. The result was that tech lobbyists could always report to the home office that they had something to do, and tech trade associations could garner corporate support for all those noon-time strategy meetings over sandwiches—without generating a true threat to the business models of the companies they (purport to) represent.

My point is not that Boucher should have advanced his privacy legislation—it’s not going to be federal law that delivers privacy. I’m just not unhappy that he’s gone. (Not that far gone. Watch for him to take a job somewhere in the D.C. tech lobby. Knowing nothing about his plans, I’d give it a greater than 50% chance.)

The tech lobby will actually have some work to do under Boucher’s likely successor in the role of Democratic tech/consumer protection leader. Ed Markey (D-MA) is a partisan and an ideologue who will actually require the tech lobby to defend itself. He’s canny enough to have decent influence even from his perch in the minority.

UPDATE w/additional thought: Democrat Richard Blumenthal, elected to the Senate from Connecticut, is a technophobe demagogue—or plays one on TV, which is what matters. He went to war against Craigslist to boost his campaign, and his win is a notable loss for tech and free speech.

But—really—the fate of our privacy, the fate of our tech sector, and the fate of our country and society shouldn’t turn on elections. We are not defined by these people, who go to Washington, D.C. to sit atop the coercive authority machine for a while. Elections come and go. I’ll continue to work on returning power to civil society where it belongs.

Rep. Darrell Issa (R-CA) has a terrific op-ed piece on Internet-age government transparency in the Washington Times today:

If agencies used consistent data formats for their financial information, their financial reports could be electronically reconciled. It would be possible to trace funds from Congressional appropriations through agencies’ budgets to final use. The same data could flow automatically into USASpending.gov, without the errors and inconsistencies that make it unreliable today.

The idea is simple, if not easy to implement. Put government data in uniform formats, accessible to the public, and let public oversight work its will. Whether you prioritize good government, small government, or both, expect improvement.
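To make the idea concrete, here is a minimal sketch, in Python, of what electronic reconciliation could look like once agencies report spending in a shared, machine-readable format. The account identifiers, field names, and dollar figures are hypothetical illustrations, not any actual federal data standard.

```python
# Hypothetical records in a shared schema: every agency reports the same fields,
# keyed by appropriation account, so totals can be compared automatically.
appropriations = {
    "account-0042": 1_500_000,  # amount appropriated by Congress (illustrative)
    "account-0043": 750_000,
}

agency_reports = [
    {"account": "account-0042", "obligated": 900_000},
    {"account": "account-0042", "obligated": 580_000},
    {"account": "account-0043", "obligated": 760_000},
]

def reconcile(appropriations, reports):
    """Sum reported obligations per account and flag any account that exceeds its appropriation."""
    totals = {}
    for report in reports:
        totals[report["account"]] = totals.get(report["account"], 0) + report["obligated"]
    for account, appropriated in appropriations.items():
        reported = totals.get(account, 0)
        status = "OK" if reported <= appropriated else "EXCEEDS APPROPRIATION"
        print(f"{account}: appropriated {appropriated:,}, reported {reported:,} -> {status}")

reconcile(appropriations, agency_reports)
```

The arithmetic is trivial; the point is that it only becomes possible when every agency uses the same fields and identifiers. With today’s inconsistent formats, that comparison has to be done by hand, if it is done at all.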

Carl Malamud is a breakthrough thinker and doer on transparency and open government. In the brief video below, he makes the very interesting case that various regulatory codes are wrongly withheld from the public domain while citizens are expected to comply with them. It’s important, mind-opening stuff.

It seems a plain violation of due process that a person might be presumed to know laws that are not publicly available. I’m not aware of any cases finding that inability to access the law for want of money is a constitutional problem, but the situation analogizes fairly well to Harper v. Virginia Board of Elections, in which a poll tax that would exclude the indigent from voting was found to violate equal protection.

Regulatory codes that must be purchased at a high price will tend to cartelize trades by raising a barrier to entry against those who can’t pay for copies of the law. Private ownership of public law seems plainly inconsistent with due process, equal protection, and the rule of law. You’ll sense in the video that Malamud is no libertarian, but an enemy of an enemy of ordered liberty is a friend of liberty.

(Second in a series.)

I recently picked up a copy of Robert Wuthnow’s Be Very Afraid: The Cultural Response to Terror, Pandemics, Environmental Devastation, Nuclear Annihilation, and Other Threats. According to the dust cover, the Princeton sociologist’s book “examines the human response to existential threats…” Contrary to common belief, we do not deny such threats but “seek ways of positively meeting the threat, of doing something—anything—even if it’s wasteful and time-consuming.” Interesting batch of ideas, no?

Well, the fifth paragraph of the book joins up with some pretty obscure and disorganized writing in the introduction to disqualify it from absorbing any more of my precious time. That paragraph contains this sentence: “Millions could die from a pandemic or a dirty bomb strategically placed in a metropolitan area.”

It’s probably true that millions could die from a pandemic. Two million deaths would be just under 0.03% of the world’s population—not quite existential. But the killer for the book is Wuthnow saying that millions could die from a dirty bomb placed in a metropolitan area. There will never be that many deaths from a dirty bomb, placed anywhere, ever.

One suspects that the author doesn’t know what a dirty bomb is. A dirty bomb is a combination of conventional explosives and radioactive material, designed to disperse the radioactive material over a wide area. A dirty bomb is not a nuclear explosive, and its lethality is little greater than that of a conventional weapon, because the radiological material is likely to be too dispersed and too weak to cause serious health effects.

Dirty bombs are meant to scare. Incautious discussion of dirty bombs has induced more fright in our society than any actual bomb. Professor Wuthnow asserts, as fact, that a dirty bomb could kill millions, which is plainly wrong. If he doesn’t know his subject matter, he doesn’t get any more time from this reader.

Given my brief experience with the book, I advise you to be very afraid of Be Very Afraid.

The Federal Communications Commission has established a new advisory group called the “Technological Advisory Council.” Among other things it will advise the agency on “how broadband communications can be part of the solution for the delivery and cost containment of health care, for energy and environmental conservation, for education innovation and in the creation of jobs.”

This is an agency that is radically overspilling its bounds. It has established goals that it has no proper role in fulfilling and no idea how to fulfill. As we look for cost-cutting measures at the federal level, we could end the pretense that the communications industry should be regulated as a public utility. Shuttering the FCC would free up funds for better purposes, such as lowering the national debt or reducing taxes.

Late last month, the National Research Council released a book entitled “Biometric Recognition: Challenges and Opportunities” that exposes the many difficulties with biometric identification systems. Popular culture has portrayed biometrics as nearly infallible, but it’s just not so, the report emphasizes. Especially at scale, biometrics will encounter a lot of challenges, from engineering problems to social and legal considerations.

“[N]o biometric characteristic, including DNA, is known to be capable of reliably correct individualization over the size of the world’s population,” the report says. (page 30) As with analog, in-person identification, biometrics produces a probabilistic identification (or exclusion), but not a certain one. Many biometrics change with time. Due to injury, illness, and other causes, a significant number of people do not have biometric characteristics like fingerprints and irises, requiring special accommodation.

At the scale often imagined for biometric systems, even a small number of false positives or false negatives (referred to in the report as false matches and false nonmatches) will produce considerable difficulties. “[F]alse alarms may consume large amounts of resources in situations where very few impostors exist in the system’s target population.” (page 45)

Consider a system that produces a false negative, excluding someone from access to a building, one time in a thousand. If there aren’t impostors attempting to defeat the biometric system on a regular basis, the managers of the system will quickly come to assume that the system is always mistaken when it produces a “nonmatch” and they will habituate to overruling the biometric system, rendering it impotent.
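To see why even a one-in-a-thousand error rate causes trouble at scale, here is a minimal sketch, in Python, of the base-rate arithmetic behind that example. The daily traffic and impostor figures are my own illustrative assumptions, not numbers from the report.

```python
# Base-rate sketch: a door-access system with very few real impostors.
false_nonmatch_rate = 1 / 1000    # legitimate user wrongly rejected (the example above)
true_nonmatch_rate = 0.95         # assumed chance an actual impostor is correctly rejected

daily_legitimate_entries = 5_000  # assumed building traffic
daily_impostor_attempts = 0.01    # assumed: roughly one attempt every hundred days

false_alarms = daily_legitimate_entries * false_nonmatch_rate
true_alarms = daily_impostor_attempts * true_nonmatch_rate
total_alarms = false_alarms + true_alarms

print(f"Expected alarms per day: {total_alarms:.2f}")
print(f"Share of alarms involving a real impostor: {true_alarms / total_alarms:.2%}")
# Under these assumptions there are about five alarms a day, and fewer than 0.2
# percent of them involve a real impostor -- which is why operators habituate
# to overriding the system.
```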

Context is everything. Biometric systems have to be engineered for particular usages, keeping the interests of the users and operators in mind, then tested and reviewed thoroughly to see if they are serving the purpose for which they’re intended. The report debunks the “magic wand” capability that has been imputed to biometrics: “[S]tating that a system is a biometric system or uses ‘biometrics’ does not provide much information about what the system is for or how difficult it is to successfully implement.” (page 60)