Co-authored with Adam Thierer

Why would progressives abandon the most successful progressive technology policy ever formulated?

In a recent piece in The Washington Spectator, Marc Rotenberg and Larry Irving have some harsh words for progressives’ supposed starry-eyed treatment of Internet firms and the Clinton Administration policies that helped give rise to the modern digital economy. They argue that the Internet has failed to live up to its promise in part because “[p]rogressive leaders moved away from progressive values on tech issues, and now we live with the consequences.”

But if the modern Internet we know today is truly the result of progressives' self-repudiation, then we owe them and the Clinton Administration a debt of gratitude, not a lecture.

Unfortunately, Rotenberg and Irving take a different perspective. They criticize progressives for standing aside while “a new mantra of ‘multistakeholder engagement’” replaced traditional regulatory governance structures, unleashing a Pandora’s Box of “self-regulatory processes” that failed to keep the private sector accountable to the public.

Rotenberg and Irving are also upset that the First Amendment rights of Internet companies have received stronger support following the implementation of Section 230 of the Communications Decency Act, which was enacted by Congress in 1996 and signed into law by President Clinton as part of the Telecommunications Act of 1996.

All of this could have been avoided, they argue, if the Clinton Administration had instead embraced the creation of a National Information Infrastructure (NII) to govern the Internet. As part of its 1993 proposed “Agenda for Action,” the Clinton White House toyed with the idea that “[d]evelopment of the NII can help unleash an information revolution that will change forever the way people live, work, and interact with each other,” citing specific examples of how it would: empower people to “live almost anywhere they wanted, without foregoing opportunities for useful and fulfilling employment”; make education “available to all students, without regard to geography, distance, resources, or disability”; and permit healthcare and other social needs to be delivered “on-line, without waiting in line, when and where you needed them.” Luckily, all these things came to pass precisely because the Clinton Administration went a different route, ignoring the heavy-handed regulatory approach offered by early tech policy wonks and opting instead to embrace a different governance framework: The Framework for Global Electronic Commerce.

The 1997 Framework outlined a succinct, market-oriented vision for the Internet and the emerging digital economy. It envisioned a model of cyberspace governance that relied on multistakeholder collaboration and ongoing voluntary negotiations and agreements to find consensus on the new challenges of the information age. Policy was to be formulated in an organic, bottom-up, and fluid fashion. This was a stark and welcome break from the failed top-down technocratic regulatory regimes of the analog era, which had long held back innovation and choice in traditional communications and media sectors.

“Where governmental involvement is needed,” The Framework advised, “its aim should be to support and enforce a predictable, minimalist, consistent and simple legal environment for commerce.” The result was one of the most amazing explosions in innovation our nation and, indeed, the entire world had ever witnessed. It was precisely the flexibility of multistakeholder governance—as well as the strong support for the free flow of speech and commerce—that unleashed this tsunami of technological progress.  

It’s strange, then, that Rotenberg and Irving decry the era of “multistakeholder engagement” that the Clinton Administration Framework presaged, especially because they included similar provisions in their own frameworks. For example, in “A Public-Interest Vision of the National Information Infrastructure,” the authors specifically called for “democratic policy-making” in the governance of the emerging Internet, arguing that “[t]he public should be fully involved in policy-making for the information infrastructure.” They go even further by citing the value of “participatory design,” which emphasized iterative experimentation and information feedback loops (learning by doing) in the process of designing network standards and systems. These “[n]ew approaches,” Rotenberg and Irving argue, “combine the centralized and decentralized models, obtaining the benefits of each while avoiding their deficiencies.” Embracing “[b]oth participatory design and the experimental approach to standardization,” they concluded, would “achieve the benefits of democratic input to design and policy-making without sacrificing the technical advantages of consistency and elegance of design.”

On this point, Rotenberg and Irving are correct. Unfortunately, it seems their valuation of such processes does not extend to the regulatory structures overseeing these technologies. This is despite the "Agenda for Action" explicitly calling for the NII to "complement … the efforts of the private sector" by "work[ing] in close partnership with business, labor, academia, the public, Congress, and state and local government." What's more "multistakeholder" than that?

For all their lamentations of the multistakeholder process, Rotenberg and Irving engaged in that very process in the 1990s. Their proposals had their shot at convincing the Clinton Administration that a national regulatory agency governing the Internet was necessary to usher in the digital age. And in one of those ironic twists of history, they failed to get their agency, but nevertheless bore witness to the emergence of a free and open Internet where innovation and progress still flourish.

We shouldn’t lose sight of this miraculous achievement and the public policies that made it all possible. There’s nothing “progressive” about rolling back the clock in the way Rotenberg and Irving recommend. Instead, America should double-down on the Clinton Administration’s vision for innovation policy by embracing permissionless innovation, collaborative multistakeholderism, and strong support for freedom of speech as the cornerstones of public policy toward other emerging technologies and sectors.

The FCC released a proposed Order today that would create an Office of Economics and Analytics. Last April, Chairman Pai proposed this data-centric office. There are about a dozen bureaus and offices within the FCC and this proposed change in the FCC’s organizational structure would consolidate a few offices and many FCC economists and experts into a single office.

This is welcome news. Several years ago, when I was in law school, I was a legal clerk for the FCC Wireless Bureau and for the FCC Office of General Counsel. During that ten-month stint, I was surprised at the number of excellent economists at the FCC. I assisted several of them closely (and helped organize what one FCC official dubbed, unofficially, "The Economists' Cage Match" for outside experts sparring over the competitive effects of the proposed AT&T-T-Mobile merger). However, my impression even during my limited time at the FCC was well-stated by Chairman Pai in April:

[E]conomists are not systematically incorporated into policy work at the FCC. Instead, their expertise is typically applied in an ad hoc fashion, often late in the process. There is no consistent approach to their use.

And since the economists are sprinkled about the agency, their work is often "siloed" within their respective bureaus. Economics as an afterthought in telecom is not good for the development of US tech industries, nor for consumers.

As Geoffrey Manne and Allen Gibby said recently, "the future of telecom regulation is antitrust," and the creation of the OEA is a good step in line with global trends. Many nations–like the Netherlands, Denmark, Spain, Japan, South Korea, and New Zealand–are restructuring legacy telecom regulators. The days of public and private telecom monopolies and of discrete, separate communications, computer, and media industries (and thus bureaus) are past. Convergence, driven by IP networks and deregulation, has created these trends and resulted in sometimes dramatic restructuring of agencies.

In Denmark, for instance, as Roslyn Layton and Joe Kane have written, national parties and regulators took inspiration from the deregulatory plans of the Clinton FCC. The Social Democrats, the Radical Left, the Left, the Conservative People’s Party, the Socialist People’s Party, and the Center Democrats agreed in 1999:

The 1990s were focused on breaking down old monopoly; now it is important to make the frameworks for telecom, IT, radio, TV meld together—convergence. We believe that new technologies will create competition.

It is important to ensure that regulation does not create a barrier for the possibility of new converged products; for example, telecom operators should be able to offer content if they so choose. It is also important to ensure digital signature capability, digital payment, consumer protection, and digital rights. Regulation must be technologically neutral, and technology choices are to be handled by the market. The goal is to move away from sector-specific regulation toward competition-oriented regulation. We would prefer to handle telecom with competition laws, but some special regulation may be needed in certain cases—for example, regulation for access to copper and universal service.

This agreement was followed up by the quiet shuttering of NITA, the Danish telecom agency, in 2011.

Bringing economic rigor to the FCC's notoriously vague "public interest" standard seemed to be occurring (slowly) during the Clinton and Bush administrations. During the Obama years, however, this progress was derailed, largely by the net neutrality silliness, which not only distracted US regulators from actual problems like rural broadband expansion but also reinvigorated the media-access movement, whose followers believe the FCC should have a major role in shaping US culture, media, and technologies.

Fortunately, those days are in the rearview mirror. The proposed creation of the OEA represents another pivot toward the likely future of US telecom regulation: a focus on consumer welfare, competition, and data-driven policy.

Technology policy has made major inroads into a growing number of fields in recent years, including health care, labor, and transportation, and we at the Technology Liberation Front have brought a free-market lens to these issues for over a decade. As is our annual tradition, below are the most popular posts* from the past year, as well as key excerpts.

Enjoy, and Happy New Year.

Reason magazine recently published my review of Franklin Foer’s new book, World Without Mind: The Existential Threat of Big Tech. My review begins as follows:

If you want to sell a book about tech policy these days, there’s an easy formula to follow.

First you need a villain. Google and Facebook should suffice, but if you can throw in Apple, Amazon, or Twitter, that’s even better. Paint their CEOs as either James Bond baddies bent on world domination or naive do-gooders obsessed with the quixotic promise of innovation.

Finally, come up with a juicy Chicken Little title. Maybe something like World Without Mind: The Existential Threat of Big Tech. Wait—that one’s taken. It’s the title of Franklin Foer’s latest book, which follows this familiar techno-panic template almost perfectly.

The book doesn’t break a lot of new ground; it serves up the same old technopanicky tales of gloom-and-doom that many others have said will befall us unless something is done to save us. But Foer’s unique contribution is to unify many diverse strands of modern tech criticism in one tome, and then amp up the volume of panic about it all. Hence, the “existential” threat in the book’s title. I bet you didn’t know the End Times were so near!

Read the rest of my review over at Reason. And, if you care to read some of my other essays on technopanics through the ages, here’s a compendium of them.

The House version of the Stop Enabling Sex Trafficking Act (SESTA), called the Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA), has undergone significant changes that appear to enable it both to truly address the scourge of online sex trafficking and to maintain the important internet liability protections that encourage a free and open internet. On Tuesday, this amended version passed the House Judiciary Committee. Like most legislation, this latest draft isn't perfect. But it has made significant steps toward maintaining freedom online while addressing the misdeeds of a few.


In 2015 after White House pressure, the FCC decided to take the radical step of classifying “broadband Internet access service” as a heavily-regulated Title II service. Title II was created for the AT&T long-distance monopoly and telegraph network and “promoting innovation and competition” is not its purpose. It’s ill-suited for the modern Internet, where hundreds of ISPs and tech companies are experimenting with new technologies and topologies.

Commissioner Brendan Carr was gracious enough to speak with Chris Koopman and me in a Mercatus podcast last week about his decision to vote to reverse the Title II classification. The podcast can be found at the Mercatus website. One highlight from Commissioner Carr:

Congress had a fork in the road. …In 1996, Congress made a decision that we’re going to head down the Title I route [for the Internet]. That decision has been one of the greatest public policy decisions that we’ve ever seen. That’s what led to the massive investment in the Internet. Over a trillion dollars invested. Consumers were protected. Innovators were free to innovate. Unfortunately, two years ago the Commission departed from that framework and moved into a very different heavy-handed regulatory world, the Title II approach.

Along those lines, in my recent ex parte meeting with Chairman Pai’s office, I pointed to an interesting 2002 study in the Review of Economics and Statistics from MIT Press about the stifling effects of Title II regulation:

[E]xisting economics scholarship suggests that a permissioned approach to new services, like that proposed in the [2015] Open Internet Order, inhibits innovation and new services in telecommunications. As a result of an FCC decision and a subsequent court decision in the late 1990s, for 18 to 30 months, depending on the firm, [Title II] carriers were deregulated and did not have to submit new offerings to the FCC for review. After the court decision, the FCC required carriers to file retroactive plans for services introduced after deregulation.

This turn of events allowed economist James Prieger to analyze and compare the rate of new services deployment in the regulated period and the brief deregulated period. Prieger found that "some otherwise profitable services are not financially viable under" the permissioned regime. Critically, the number of services carriers deployed "during the [deregulated] interim is 60%-99% larger than the model predicts they would have created" when preapproval was required. Finally, Prieger found that firms would have introduced 62% more services during the entire study period had there been no permissioned regime. This is suggestive evidence that the Order's "Mother, May I?" approach will significantly harm the Internet services market.

Thankfully, this FCC has incorporated economic scholarship into its Restoring Internet Freedom Order and will undo the costly Title II classification for Internet services.

Over at Plain Text, I have posted a new essay entitled, “Converting Permissionless Innovation into Public Policy: 3 Reforms.” It’s a preliminary sketch of some reform ideas that I have been working on as part of my next book project. The goal is to find some creative ways to move the ball forward on the innovation policy front, regardless of what level of government we are talking about.

To maximize the potential for ongoing, positive change and create a policy environment conducive to permissionless innovation, I argue that policymakers should pursue policy reforms based on these three ideas:

  1. The Innovator's Presumption: Any person or party (including a regulatory authority) who opposes a new technology or service shall have the burden to demonstrate that such proposal is inconsistent with the public interest.
  2. The Sunsetting Imperative: Any existing or newly imposed technology regulation should include a provision sunsetting the law or regulation within two years.
  3. The Parity Provision: Any operator offering a similarly situated product or service should be regulated no more stringently than its least regulated competitor.

These provisions are crafted in a somewhat generic fashion in the hope that these reform proposals could be modified and adopted by various legislative or regulatory bodies. If you are interested in reading more details about each proposal, jump over to Plain Text to read the entire essay.

As I have previously written, a bill currently up for debate in Congress runs the risk of gutting critical liability protections for internet intermediaries. Earlier today the Stop Enabling Sex Traffickers Act passed out of committee with an amendment that attempted to remedy some of the most damaging changes to Section 230 in the original act. While this amendment has gained support from some industry groups, it does not fully address the concerns regarding changes to intermediary liability under Section 230. The amended version shows increased awareness of the act's far-reaching consequences, but it still leaves issues that could have a chilling effect on speech on the internet and risk stifling future internet innovation.


Tesla, Volvo, and Cadillac have all released vehicles with features that push beyond standard level 2 automation and approach a level 3 "self-driving" system, in which the driver still needs to be present but the car can do most of the work. While there have been some notable accidents, most of these were tied to driver error or behavior, not the technology. Still, autonomous vehicles hold the promise of reducing traffic accidents by more than 90% if widely adopted. However, fewer accidents and a reduced potential for human error in driving could change the function and formulas of the auto insurance market.


Broadcast license renewal challenges have troubled libertarians and free speech advocates for decades. Despite our efforts (and our law journal articles on the abuse of the licensing process), license challenges are legal. In fact, political parties, prior FCCs, and activist groups have encouraged license challenges based on TV content to ensure broadcasters are operating in "the public interest." Further, courts have compelled, and will compel, a reluctant FCC to investigate "news distortion" and other violations of FCC broadcast rules. It's a troubling state of affairs that has been pushed back into relevance because FCC license challenges are in the news.

In recent years the FCC, whether led by Democrats or Republicans, has preferred to avoid tricky questions surrounding license renewals. Chairman Pai, like most recent FCC chairs, has been an outspoken defender of First Amendment protections and norms. He opposed, for instance, the Obama FCC’s attempt to survey broadcast newsrooms about their coverage. He also penned an op-ed bringing attention to the fact that federal NSF funding was being used by left-leaning researchers to monitor and combat “misinformation and propaganda” on social media.

The Republican commissioners' silence today about license renewals is likely primarily because they have higher priorities (like broadband deployment and freeing up spectrum) than intervening in the competitive media marketplace. But a second, less understood reason is that whether to investigate a news station isn't really up to them. Courts can overrule them and compel an investigation.

Political actors have used FCC licensing procedures for decades to silence political opponents and unfavorable media. For reasons I won’t explore here, TV and radio broadcasters have diminished First Amendment rights and the public is permitted to challenge their licenses at renewal time.

So, progressive "citizens groups" even in recent years have challenged broadcasters' license renewals over "one-sided programming." Unfortunately, it works. For instance, in 2004 the threat of multi-year renewal challenges from outside groups and the risk of payback from a Democrat FCC forced broadcast stations to trim a documentary critical of John Kerry from 40 minutes to 4 minutes. And, unlike their cable counterparts, broadcasters censor nude scenes in TV and movies because even a Janet Jackson Super Bowl scenario can lead to expensive license challenges.

These troubling licensing procedures and pressure points were largely unknown to most people, but, on October 11, President Trump tweeted:

“With all of the Fake News coming out of NBC and the Networks, at what point is it appropriate to challenge their License? Bad for country!”

So why hasn’t the FCC said they won’t investigate NBC and other broadcast station owners? It may be because courts can compel the FCC to investigate “news distortion.”

This is exactly what happened to the Clinton FCC. As Melody Calkins and I wrote in August about the FCC’s news distortion rule:

Though uncodified and not strictly enforced, the rule was reiterated in the FCC's 2008 broadcast guidelines. The outline of the rule was laid out in the 1998 case Serafyn v. CBS, involving a complaint by a Ukrainian-American who alleged that the "60 Minutes" news program had unfairly edited interviews to portray Ukrainians as backwards and anti-Semitic. The FCC dismissed the complaint, but the D.C. Circuit reversed that dismissal and required FCC intervention. (CBS settled and the complaint was dropped before the FCC could intervene.)

The commissioners might personally wish broadcasters had full First Amendment protections and want to dismiss all challenges, but current law permits and encourages license challenges. The commission can be compelled to act because of the sins of omission of prior FCCs: deciding to retain the news distortion rule and other antiquated "public interest" regulations for broadcasters. The existence of these old media rules means the FCC's hands are tied.