March 2012

It’s well known now that a long-simmering contest for control of the Cato Institute has bubbled over. On the last day of February, Charles and David Koch filed a lawsuit against the widow of former Cato chairman Bill Niskanen, Cato president Ed Crane, and Cato itself seeking to have Niskanen’s shares returned to Cato or granted to the remaining shareholders under the terms of a shareholder agreement. This would give the Kochs (one of whom participated in the founding of Cato) majority ownership, allowing them to elect a majority of Cato’s board. It would also position them to extinguish Crane’s shares so as to gain 100% control.

Cato disputes the Kochs’ legal positions, and it believes that their success “would swiftly and irrevocably damage the Cato Institute’s credibility as a non-partisan, independent advocate for free markets, individual liberty, and peace.”

The quote just above is from Cato’s “Save Cato” web page, but the more interesting commentary has been scattered by Cato staff and leadership across various blogs and outlets (e.g., Jerry Taylor, Gene Healy, Jason Kuznicki, Julian Sanchez, Jonathan Blanks, Justin Logan, Trevor Burrus, Michael Cannon). There has been lots of commentary from many quarters, of course, led by Jonathan Adler at the Volokh Conspiracy. Really, there’s too much commentary to list.

A Facebook page dedicated to “saving” Cato has zoomed past 5,000 supporters.

Now it’s my turn. Putting my thoughts here on TLF is a stretch because I won’t be talking about tech. Think of this as the “liberation” part of Tech Liberation Front. Many of my colleagues and I do what we do here because of both Ed Crane and the Kochs, and the institutions they have built and nurtured. Now these giants in the modern liberty movement are fighting.

That’s a shame for a lot of reasons. There is the overall cause of freedom, of course, our part of which is side-tracked and sullied by the dispute. We Catoites love what we do, fighting for freedom backed by thousands of highly engaged supporters. But don’t go all analytical and forget the hundred-plus Cato staff whose livelihoods and careers are under a cloud. That’s concerning and frustrating, especially for the people with children. Once or twice, I’ve let my colleagues know when I found their arguments overwrought; that personal dimension might be why they argued as they did.

Yes, Cato people are people. And so are Koch people. This is important to surface as part of the theme I want to focus on: miscalculation. Continue reading →

Imagine the following scenario. The government passes a law that includes regulations governing “transactional consent” for retail commerce. These regulations stipulate how buyers and sellers of various goods shall do business. Some of the rules give sellers special rights to demand that stores carry certain of their goods; others stipulate that stores may not carry the goods of competing sellers from other markets. On the flip side, other preexisting rules give buyers the right to demand that certain sellers deal their goods to them at regulated rates.

Now, it’s true that a contractual negotiation takes place in this “marketplace” governed by “transactional consent” regulations, but does this sound like a truly free market to you? Most of us would say No.

Regrettably, that’s the essential error the American Conservative Union (ACU) makes in a letter it sent to members of Congress this week making the case against H.R. 3675 and S. 2008, “The Next Generation Television Marketplace Act.” That bill, which is sponsored by Senator Jim DeMint (R-SC) and Rep. Steve Scalise (R-LA), represents a comprehensive attempt to deregulate America’s heavily regulated video marketplace. In a recent Forbes op-ed, I argued that the DeMint-Scalise effort would take us “Toward a True Free Market in Television Programming” by eliminating a litany of archaic media regulations that should never have been on the books to begin with. The measure would:

  • eliminate “retransmission consent” regulations (rules governing contractual negotiations for content);
  • end “must carry” mandates (the requirement that video distributors carry broadcast signals even if they don’t want to);
  • repeal “network non-duplication” and “syndicated exclusivity” regulations (rules that prohibit distributors from striking deals with broadcasters outside their local communities);
  • end various media ownership regulations; and
  • end the compulsory licensing requirements of the Copyright Act of 1976, which essentially force a “duty to deal” upon content owners to the benefit of video distributors.

Despite these clearly deregulatory provisions, in its letter to Capitol Hill, the ACU argues that the DeMint-Scalise bill would somehow interfere with what it regards as a free market in video programming. The ACU writes: Continue reading →

Yesterday, the International Center for Law and Economics and TechFreedom jointly filed comments [pdf] with the FCC on the Verizon SpectrumCo deal.  In the comments, ICLE Executive Director Geoffrey Manne and TechFreedom President Berin Szoka counter the primary arguments against the deal:

Critics lament the concentration of spectrum in the hands of one of the industry’s biggest players, but the assumption that concentration will harm consumers is unsupported and misplaced. Concentration of spectrum has not slowed the growth of the market; rather, the problem is that growth in demand has dramatically outpaced capacity. What’s more, prices have plummeted even as the industry has become more concentrated.

While the FCC undeniably has authority to review the license transfers, the claim that the separate but related commercial agreements would reduce competition is properly the province of the Department of Justice. That claim is best evaluated under the antitrust laws, not by the FCC under its vague “public interest” standard. Indeed, if the FCC can assert jurisdiction over the commercial agreements as part of its public interest review, its authority over license transfers will become a license to regulate all aspects of business. This is a recipe for certain mischief.

The need for all competitors, including Verizon, to obtain sufficient spectrum to meet increasing demand demonstrates that the deal is in the public interest and should be approved.

On the podcast this week, Bruce Schneier, internationally renowned security expert and author, discusses his new book entitled, “Liars & Outliers: Enabling the Trust That Society Needs To Thrive.” Schneier starts the discussion by looking at society and trust and explains why he thinks the two are necessary for civilization. According to Schneier, two concepts contribute to a trustful society: first, humans are mostly moral; second, informal reputation systems incentivize trustworthy behavior. The discussion turns to technology and trust, and Schneier talks about how the information society yields greater consequences when trust is breached. He then describes how society deals with technology and trust and why he thinks the system is not perfect but working well overall.

To keep the conversation around this episode in one place, we’d like to ask you to comment at the webpage for this episode on Surprisingly Free. Also, why not subscribe to the podcast on iTunes?

I want to highly recommend everyone watch this interesting new talk by danah boyd on “Culture of Fear + Attention Economy = ?!?!” In her talk, danah discusses “how fear gets people into a frenzy” or panic about new technologies and new forms of culture. “The culture of fear is the idea that fear can be employed by marketers, politicians, the media, and the public to really regulate the public… such that they can be controlled,” she argues. “Fear isn’t simply the product of natural forces. It can systematically be generated to entice, motivate, or suppress. It can be leveraged as a political tool and those in power have long used fear for precisely these goals.” I discuss many of these issues in my new 80-page white paper, “Technopanics, Threat Inflation, and the Danger of an Information Technology Precautionary Principle.”

[Embedded video: Webstock ’12: danah boyd – “Culture of Fear + Attention Economy = ?!?!” from Webstock on Vimeo]

danah points out that new media is often leveraged to generate fear and so we should not be surprised when the Internet and digital technologies are used in much the same way. She also correctly notes that our cluttered, cacophonous information age might also be causing an escalation of fear-based tactics. “The more there are stimuli competing for your attention, the more likely it is that fear is going to be the thing that will drive your attention” to the things that some want you to notice or worry about.

I spent some time in my technopanics paper discussing this point in Section III.C (“Bad News Sells: The Role of the Media, Advocates, and the Listener”). Here’s the relevant passage: Continue reading →

The Federal Trade Commission (FTC) has just released its final privacy framework proposal, “Protecting Consumer Privacy in an Era of Rapid Change.” The agency released a draft report with the same title back in late 2010 and then asked for comments. [Here were my comments to the agency.] The FTC’s final report comes just a month after the Obama Administration released its 50-page privacy framework, Consumer Data Privacy in a Networked World, which included a privacy “bill of rights.” That report was primarily driven by the Department of Commerce. [I penned a Forbes column about that report the day it was released.]  The new FTC report is fairly consistent with the earlier Commerce Department report.  Here are some of the key themes or recommendations from the final FTC report:

  • rooted in a set of baseline privacy principles with a strong push for “privacy by design,” more consumer choice, and better transparency.
  • along with the Department of Commerce, the agency will work with industry to develop privacy codes of conduct and then give them teeth with the possibility of FTC enforcement.
  • pushes for industry to pursue a voluntary “Do Not Track” mechanism, which to the agency apparently means “do not collect” any info.
  • calls on Congress to pass data security legislation and legislation “to provide greater transparency for, and control over, the practices of information brokers.” Also, “to further increase transparency, the Commission calls on data brokers that compile data for marketing purposes to explore creating a centralized website where data brokers could (1) identify themselves to consumers and describe how they collect and use consumer data and (2) detail the access rights and other choices they provide with respect to the consumer data they maintain.”
  • the agency will host a workshop later this year to discuss privacy within “large platform providers.” The report notes: “To the extent that large platforms, such as Internet Service Providers, operating systems, browsers, and social media, seek to comprehensively track consumers’ online activities, it raises heightened privacy concerns.”
  • the agency is also stepping up oversight on mobile privacy issues.
  • the agency says it “generally supports the exploration of efforts to develop additional mechanisms, such as the ‘eraser button’ for social media,” but stops short of saying it should be mandated at this time.

Some of my initial random thoughts about the FTC report: Continue reading →

The Federal Trade Commission issued a report today calling on companies “to adopt best privacy practices.” In related news, most people support airline safety… The report also “recommends that Congress consider enacting general privacy legislation, data security and breach notification legislation, and data broker legislation.”

This is regulatory cheerleading of the same kind our government’s all-purpose trade regulator put out a dozen years ago. In May of 2000, the FTC issued a report finding “that legislation is necessary to ensure further implementation of fair information practices online” and recommending a framework for such legislation. Congress did not act on that, and things are humming along today without top-down regulation of information practices on the Internet.

By “humming along,” I don’t mean that all privacy problems have been solved. (And they certainly wouldn’t have been solved if Congress had passed a law saying they should be.) “Humming along” means that ongoing push-and-pull among companies and consumers is defining the information practices that best serve consumers in all their needs, including privacy.

Congress won’t be enacting legislation this year, and there doesn’t seem to be any groundswell for new regulation in the next Congress, though President Obama’s reelection would leave him unencumbered by future elections and so inclined to indulge the pro-regulatory fantasies of his supporters.

The folks who want regulation of the Internet in the name of privacy should explain how they will do better than Congress did with credit reporting. In forty years of regulating credit bureaus, Congress has not come up with a system that satisfies consumer advocates’ demands. I detail that government failure in my recent Cato Policy Analysis, “Reputation under Regulation: The Fair Credit Reporting Act at 40 and Lessons for the Internet Privacy Debate.”

On Monday it was my great pleasure to participate in a Cato Institute briefing on Capitol Hill about “Internet Taxation: Should States Be Allowed to Tax outside Their Borders?” Also speaking was my old friend Dan Mitchell, a senior fellow with Cato. From the event description: “State officials have spent the last 15 years attempting to devise a regime so they can force out-of-state vendors to collect sales taxes, but the Supreme Court has ruled that such a cartel is not permissible without congressional approval. Congress is currently considering the Main Street Fairness Act, a bill that would authorize a multistate tax compact and force many Internet retailers to collect sales taxes for the first time. Is this sensible? Are there alternative ways to address tax ‘fairness’ concerns in this context?”

Watch the video for our answers. Also, here’s the big paper that Veronique de Rugy and I penned for Cato on this back in 2003, and here’s a shorter recent piece we did for Mercatus.

In their paper, “Loving the Cyber Bomb? The Dangers of Threat Inflation in Cybersecurity Policy,” my Mercatus Center colleagues Jerry Brito and Tate Watkins warned of the dangers of “threat inflation” in cybersecurity policy debates. In early 2011, Mercatus also published a paper by Sean Lawson, an assistant professor in the Department of Communication at the University of Utah, entitled “Beyond Cyber Doom,” which documented how fear-based tactics and cyber-doom scenarios and rhetoric were increasingly on display in cybersecurity policy debates. Finally, in my recent Mercatus Center working paper, “Technopanics, Threat Inflation, and the Danger of an Information Technology Precautionary Principle,” I extended their threat inflation analysis and developed a comprehensive framework offering additional examples of, and explanations for, threat inflation in technology policy debates.

These papers make it clear that a sort of hysteria has developed around cyberwar and cybersecurity issues. Frequent allusions are made in cybersecurity debates to the potential for a “Digital Pearl Harbor,” a “cyber cold war,” a “cyber Katrina,” or even a “cyber 9/11.” These analogies are made even though these historical incidents resulted in death and destruction of a sort not comparable to attacks on digital networks. Others refer to “cyber bombs” even though no one can be “bombed” with binary code. And new examples of such inflationary rhetoric seem to emerge each day. Continue reading →

On CNET today, I have a longish post on the FCC’s continued machinations over LightSquared’s and Dish Network’s respective efforts to use existing satellite spectrum to build terrestrial mobile broadband networks. Both companies plan to build 4G LTE networks; LightSquared has already spent $4 billion in build-out for its network, which it plans to offer wholesale.

After first granting and then, a year later, revoking LightSquared’s waiver to repurpose its satellite spectrum, the agency has taken a more conservative (albeit slower) course with Dish. Yesterday, the agency initiated a Notice of Proposed Rulemaking that would, if adopted, assign flexible use rights to about 40 MHz of MSS spectrum licensed to Dish.

Current allocations of spectrum have little to do with the technical characteristics of different bands. That existing licenses limit Dish and LightSquared to satellite applications, for example, is simply an artifact of more-or-less random carve-outs in the absurdly complicated spectrum map managed by the agency since 1934. Advances in technology make it possible to use many different bands successfully for many different purposes.

But the FCC’s legacy of command-and-control allocations, which favor “new” services (new, that is, until they are made obsolete in later years or decades) and shape competition to the agency’s changing whims, has left a confusing and unnecessary pile-up of limitations and conditions that severely and artificially limit the ways in which spectrum can be redeployed as technology and consumer demands change. Today, the FCC sits squarely in the middle of each of over 50,000 licenses, a huge bottleneck that is making the imminent spectrum crisis in mobile broadband even worse. Continue reading →