Articles by Brent Skorup

Brent Skorup is a senior research fellow with the Technology Policy Program at the Mercatus Center at GMU. He has an economics degree from Wheaton College and a law degree from George Mason University. Opinions are his own.


Internet regulation advocates lost their fight at the FCC, which voted in December 2017 to rescind the 2015 Open Internet Order. Advocates have now taken their push for “net neutrality” regulation to the states.

Some state officials–via procurement contracts, executive order, or legislation–are attempting to monitor and regulate traffic management techniques and Internet service provider business models in the name of net neutrality. No one, apparently, told these officials that government-mandated net neutrality principles are dead in the US.

As the litigation over the 2015 rules showed, our national laissez-faire policy towards the Internet and our First Amendment gut any attempt to enforce net neutrality. Recall that the 1996 amendments to the Communications Act announce a clear national policy about the Internet.

Last week the FCC commissioners voted to restructure the agency and create an Office of Economics and Analytics. Hopefully the new Office will give some rigor to the “public interest standard” that guides most FCC decisions. It’s important that the FCC formally inject economics into public interest determinations, perhaps much like the Australian telecom regulator’s “total welfare standard,” which is basically a social welfare calculation plus consideration of “broader social impacts.”

In contrast, the existing “standard” has several components and subcomponents (some of them contradictory) depending on the circumstances; that is, it’s no standard at all. As the first general counsel of the Federal Radio Commission, Louis Caldwell, said of the public interest standard, it means

as little as any phrase that the drafters of the Act could have used and still comply with the constitutional requirement that there be some standard to guide the administrative wisdom of the licensing authority.

Unfortunately, this means public interest determinations are largely shielded from serious court scrutiny. As Judge Posner said of the standard in Schurz Communications v. FCC,

So nebulous a mandate invests the Commission with an enormous discretion and correspondingly limits the practical scope of responsible judicial review.

Posner colorfully characterized FCC public interest analysis in that case:

The Commission’s majority opinion … is long, but much of it consists of boilerplate, the recitation of the multitudinous parties’ multifarious contentions, and self-congratulatory rhetoric about how careful and thoughtful and measured and balanced the majority has been in evaluating those contentions and carrying out its responsibilities. Stripped of verbiage, the opinion, like a Persian cat with its fur shaved, is alarmingly pale and thin.

Every party who does significant work before the FCC has agreed with Judge Posner’s sentiments at one time or another.

Which brings us to the Office of Economics and Analytics. Cost-benefit analysis has its limits, but economic rigor is increasingly important as the FCC turns its attention away from media regulation and towards spectrum assignment and broadband subsidies.

The worst excesses of FCC regulation are in the past. In 1989, for instance, one broadcaster’s staff “was required to review 14,000 pages of records to compile information for one [FCC] interrogatory alone out of 299.” And in 1970, FCC staff had to sift through and consider 60,000 TV and radio “fairness” complaints. These regulatory excesses were corrected with help from economists (namely, Ronald Coase’s recommendation that spectrum licenses be auctioned, rather than given away for free by the FCC after a broadcast “beauty contest” hearing), but history shows that FCC proceedings spiral out of control without the agency intending it.

Since Congress gave such a nebulous standard, the FCC is always at risk of regressing. Look no further than the FCC’s meaningless “Internet conduct standard” from its 2015 Open Internet Order. This “net neutrality” regulation is a throwback to the bad old days, an unpredictable conduct standard that–like the Fairness Doctrine–would constantly draw the FCC into social policy activism and distract companies with interminable FCC investigations and unknowable compliance requirements.

In the OIO’s mercifully short life, we saw glimpses of the disputes that would’ve distracted the agency and regulated companies. For instance, prominent net neutrality supporters had wildly different views about whether a common practice, “zero rating” of IP content, by T-Mobile violated the Internet conduct standard. Chairman Tom Wheeler initially called it “highly innovative and highly competitive” while Harvard professor Susan Crawford said it was “dangerous” and “malignant” and should be outlawed “immediately.” The nearly year-long FCC investigations into zero rating and the equivocal report sent a clear, chilling message to ISPs and app companies: 20 years of permissionless innovation for the Internet was long enough. Submit your new technologies and business plans to us or face the consequences.

Fortunately, by rescinding the 2015 Order and creating the new economics Office, Chairman Pai and his Republican colleagues are improving the outlook for the development of the Internet. Hopefully the Office will make social welfare calculations a critical part of the public interest standard.

Yesterday Axios published a bold, bizarre proposal–leaked documents from a “senior National Security Council official”–for accelerating 5G deployment in the US. “5G” refers to the latest generation of wireless technologies, whose evolving specifications are being standardized by global telecommunications companies as we speak. The proposal highlights some reasonable concerns–the need for secure networks, the harmful slowness in getting wireless infrastructure permits from thousands of municipalities and counties–but recommends an unreasonable solution: a government-operated, nationwide wireless network.

The proposal to nationalize some 5G equipment and network components needs to be nipped in the bud. It relies on the dated notion that centralized government management outperforms “wasteful competition.” It’s infeasible and would severely damage the US telecom and Internet sector, one of the brightest spots in the US economy. The plan will likely go nowhere, but the fact that it’s being circulated by administration officials is alarming.

First, a little context. In 1927, the US nationalized all radiofrequency spectrum, and for decades the government has rationed out dribbles of spectrum for commercial use (though much has improved since liberalization in the 1990s). To this day all spectrum is nationalized and wireless companies operate at sufferance. What this new document proposes would make a poor situation worse.

In particular, the presentation proposes to re-nationalize 500 MHz of spectrum (the 3.7 GHz to 4.2 GHz band, which contains mostly satellite and government incumbents) and build wireless equipment and infrastructure across the country to transmit on this band. The federal government would act as a wholesaler to the commercial networks (AT&T, Verizon, T-Mobile, Sprint, etc.), who would sell retail wireless plans to consumers and businesses.

The justification for nationalizing a portion of 5G networks has a national security component and an economic component: prevent Chinese spying and beat China in the “5G race.”

The announced goals are simultaneously broad and narrow, and in severe tension.

The plan is broad in that it contemplates nationalizing part of the 5G equipment and network. However, it’s narrow in that it would nationalize only a portion of the 5G network (3.7 GHz to 4.2 GHz) and not other portions (like 600 MHz and 28 GHz). This undermines the national security purpose (assuming it’s even feasible to protect the nationalized portion) since 5G networks interconnect. It’d be like having government checkpoints on Interstate 95 but leaving all other interstates checkpoint-free.

Further, the document’s author misunderstands the evolutionary nature of 5G networks. For a while, 5G will be an overlay on the existing 4G LTE network, not a brand-new parallel network, as the NSC document assumes. 5G equipment will be installed on 4G LTE infrastructure in neighborhoods where capacity is strained. In fact, as Sherif Hanna, director of the 5G team at Qualcomm, noted on Twitter, “the first version of the 5G [standard]…by definition requires an existing 4G radio and core network.”

The most implausible idea in the document is that a nationwide 5G network could be deployed in the next few years. Environmental and historic preservation review in a single city can take longer than that. (AT&T, for instance, has battled NIMBYs and local government in San Francisco for a decade to install a few hundred utility boxes in the public right-of-way.) The federal government deploying and maintaining hundreds of thousands of 5G installations from scratch in two years is a pipe dream. And how to pay for it? The “Financing” section in the document says nothing about how the federal government will find the tens of billions of dollars needed for nationwide deployment of a government 5G network.

The plan to nationalize a portion of 5G wireless networks and deploy nationwide is unwise and unrealistic. It would permanently damage the US broadband industry, it would antagonize city and state officials, it would raise serious privacy and First Amendment concerns, and it would require billions of new tax dollars to deploy. The released plan would also fail to ensure the network security it purports to protect. US telecom companies are lining up to pay the government for spectrum and to invest private dollars to build world-class 5G networks. If the federal government wants to accelerate 5G deployment, it should sell more spectrum and redirect existing government funding towards roadside infrastructure. Network security is a difficult problem but nationalizing networks is overkill.

Already, four out of five [update: all five] FCC commissioners have come out strongly against this plan. Someone reading the NSC proposal would get the impression that the US is sitting still while China is racing ahead on 5G. The US has unique challenges but wireless broadband deployment is probably the FCC’s highest priority. The Commission is aware of the permitting problems and formed the Broadband Deployment Advisory Committee in part for that very purpose (I’m a member). The agency, in cooperation with the Department of Commerce, is also busy looking for more spectrum to release for 5G.

Recode is reporting that White House officials are already distancing themselves from the proposal. Hopefully they will publicly reject the plan soon.

The FCC released a proposed Order today that would create an Office of Economics and Analytics. Last April, Chairman Pai proposed this data-centric office. There are about a dozen bureaus and offices within the FCC and this proposed change in the FCC’s organizational structure would consolidate a few offices and many FCC economists and experts into a single office.

This is welcome news. Several years ago, when I was in law school, I was a legal clerk for the FCC Wireless Bureau and for the FCC Office of General Counsel. During that ten-month stint, I was surprised at the number of excellent economists at the FCC. I worked closely with several of them (and helped organize what one FCC official unofficially dubbed “The Economists’ Cage Match” for outside experts sparring over the competitive effects of the proposed AT&T-T-Mobile merger). However, my impression even during my limited time at the FCC was well stated by Chairman Pai in April:

[E]conomists are not systematically incorporated into policy work at the FCC. Instead, their expertise is typically applied in an ad hoc fashion, often late in the process. There is no consistent approach to their use.

And since the economists are sprinkled about the agency, their work is often “siloed” within their respective bureaus. Economics as an afterthought in telecom is not good for the development of US tech industries or for consumers.

As Geoffrey Manne and Allen Gibby said recently, “the future of telecom regulation is antitrust,” and the creation of the OEA is a good step in line with global trends. Many nations–like the Netherlands, Denmark, Spain, Japan, South Korea, and New Zealand–are restructuring legacy telecom regulators. The days of public and private telecom monopolies and of discrete, separate communications, computer, and media industries (and thus bureaus) are past. Convergence, driven by IP networks and deregulation, has created these trends and has sometimes resulted in dramatic restructuring of agencies.

In Denmark, for instance, as Roslyn Layton and Joe Kane have written, national parties and regulators took inspiration from the deregulatory plans of the Clinton FCC. The Social Democrats, the Radical Left, the Left, the Conservative People’s Party, the Socialist People’s Party, and the Center Democrats agreed in 1999:

The 1990s were focused on breaking down old monopoly; now it is important to make the frameworks for telecom, IT, radio, TV meld together—convergence. We believe that new technologies will create competition.

It is important to ensure that regulation does not create a barrier for the possibility of new converged products; for example, telecom operators should be able to offer content if they so choose. It is also important to ensure digital signature capability, digital payment, consumer protection, and digital rights. Regulation must be technologically neutral, and technology choices are to be handled by the market. The goal is to move away from sector-specific regulation toward competition-oriented regulation. We would prefer to handle telecom with competition laws, but some special regulation may be needed in certain cases—for example, regulation for access to copper and universal service.

This agreement was followed up by the quiet shuttering of NITA, the Danish telecom agency, in 2011.

Bringing economic rigor to the FCC’s notoriously vague “public interest” standard seemed to be occurring (slowly) during the Clinton and Bush administrations. During the Obama years, however, this progress was derailed, largely by the net neutrality silliness, which not only distracted US regulators from actual problems like rural broadband expansion but also reinvigorated the media-access movement, whose followers believe the FCC should have a major role in shaping US culture, media, and technologies.

Fortunately, those days are in the rearview mirror. The proposed creation of the OEA represents another pivot toward the likely future of US telecom regulation: a focus on consumer welfare, competition, and data-driven policy.

Technology policy has made major inroads into a growing number of fields in recent years, including health care, labor, and transportation, and we at the Technology Liberation Front have brought a free-market lens to these issues for over a decade. As is our annual tradition, below are the most popular posts* from the past year, as well as key excerpts.

Enjoy, and Happy New Year.

In 2015 after White House pressure, the FCC decided to take the radical step of classifying “broadband Internet access service” as a heavily-regulated Title II service. Title II was created for the AT&T long-distance monopoly and telegraph network and “promoting innovation and competition” is not its purpose. It’s ill-suited for the modern Internet, where hundreds of ISPs and tech companies are experimenting with new technologies and topologies.

Commissioner Brendan Carr was gracious enough to speak with Chris Koopman and me in a Mercatus podcast last week about his decision to vote to reverse the Title II classification. The podcast can be found at the Mercatus website. One highlight from Commissioner Carr:

Congress had a fork in the road. …In 1996, Congress made a decision that we’re going to head down the Title I route [for the Internet]. That decision has been one of the greatest public policy decisions that we’ve ever seen. That’s what led to the massive investment in the Internet. Over a trillion dollars invested. Consumers were protected. Innovators were free to innovate. Unfortunately, two years ago the Commission departed from that framework and moved into a very different heavy-handed regulatory world, the Title II approach.

Along those lines, in my recent ex parte meeting with Chairman Pai’s office, I pointed to an interesting 2002 study in the Review of Economics and Statistics from MIT Press about the stifling effects of Title II regulation:

[E]xisting economics scholarship suggests that a permissioned approach to new services, like that proposed in the [2015] Open Internet Order, inhibits innovation and new services in telecommunications. As a result of an FCC decision and a subsequent court decision in the late 1990s, for 18 to 30 months, depending on the firm, [Title II] carriers were deregulated and did not have to submit new offerings to the FCC for review. After the court decision, the FCC required carriers to file retroactive plans for services introduced after deregulation.

This turn of events allowed economist James Prieger to analyze and compare the rate of new service deployment in the regulated period and the brief deregulated period. Prieger found that “some otherwise profitable services are not financially viable under” the permissioned regime. Critically, the number of services carriers deployed “during the [deregulated] interim is 60%-99% larger than the model predicts they would have created” when preapproval was required. Finally, Prieger found that firms would have introduced 62% more services during the entire study period if there had been no permissioned regime. This is suggestive evidence that the Order’s “Mother, May I?” approach will significantly harm the Internet services market.

Thankfully, this FCC has incorporated economic scholarship into its Restoring Internet Freedom Order and will undo the costly Title II classification for Internet services.

Broadcast license renewal challenges have troubled libertarians and free speech advocates for decades. Despite our efforts (and our law journal articles on the abuse of the licensing process), license challenges are legal. In fact, political parties, prior FCCs, and activist groups have encouraged license challenges based on TV content to ensure broadcasters are operating in “the public interest.” Further, courts have compelled, and will compel, a reluctant FCC to investigate “news distortion” and other violations of FCC broadcast rules. It’s a troubling state of affairs that has been pushed back into relevance because FCC license challenges are in the news.

In recent years the FCC, whether led by Democrats or Republicans, has preferred to avoid tricky questions surrounding license renewals. Chairman Pai, like most recent FCC chairs, has been an outspoken defender of First Amendment protections and norms. He opposed, for instance, the Obama FCC’s attempt to survey broadcast newsrooms about their coverage. He also penned an op-ed bringing attention to the fact that federal NSF funding was being used by left-leaning researchers to monitor and combat “misinformation and propaganda” on social media.

The Republican commissioners’ silence today about license renewals is likely, first, because they have higher priorities (like broadband deployment and freeing up spectrum) than intervening in the competitive media marketplace. But second, and less understood, whether to investigate a news station isn’t really up to them. Courts can overrule them and compel an investigation.

Political actors have used FCC licensing procedures for decades to silence political opponents and unfavorable media. For reasons I won’t explore here, TV and radio broadcasters have diminished First Amendment rights and the public is permitted to challenge their licenses at renewal time.

So, progressive “citizens groups,” even in recent years, have challenged broadcasters’ license renewals over “one-sided programming.” Unfortunately, it works. In 2004, for instance, the threat of multi-year renewal challenges from outside groups and the risk of payback from a Democrat FCC forced broadcast stations to trim a documentary critical of John Kerry from 40 minutes to 4 minutes. And, unlike their cable counterparts, broadcasters censor nude scenes in TV and movies because even a Janet Jackson Super Bowl scenario can lead to expensive license challenges.

These troubling licensing procedures and pressure points were largely unknown to most people, but, on October 11, President Trump tweeted:

“With all of the Fake News coming out of NBC and the Networks, at what point is it appropriate to challenge their License? Bad for country!”

So why hasn’t the FCC said it won’t investigate NBC and other broadcast station owners? It may be because courts can compel the FCC to investigate “news distortion.”

This is exactly what happened to the Clinton FCC. As Melody Calkins and I wrote in August about the FCC’s news distortion rule:

Though uncodified and not strictly enforced, the rule was reiterated in the FCC’s 2008 broadcast guidelines. The outline of the rule was laid out in the 1998 case Serafyn v. CBS, involving a complaint by a Ukrainian-American who alleged that the “60 Minutes” news program had unfairly edited interviews to portray Ukrainians as backwards and anti-Semitic. The FCC dismissed the complaint but the DC Circuit reversed that dismissal and required FCC intervention. (CBS settled and the complaint was dropped before the FCC could intervene.)

The commissioners might personally wish broadcasters had full First Amendment protections and want to dismiss all challenges, but current law permits and encourages license challenges. The commission can be compelled to act because of the sins of omission of prior FCCs, which decided to retain the news distortion rule and other antiquated “public interest” regulations for broadcasters. The existence of these old media rules means the FCC’s hands are tied.

Internet regulation advocates are trying to turn a recent FCC Notice of Inquiry about the state of US telecommunications services into a controversy. Twelve US Senators have accused the FCC of wanting to “redefin[e] broadband” in order to “abandon further efforts to connect Americans.”

Given that Chairman Pai and the Commission are already taking actions to accelerate the deployment of broadband, with new proceedings and the formation of the Broadband Deployment Advisory Committee, the allegation that the current NOI is an excuse for inaction is perplexing.

The true “controversy” is much more mundane–reasonable people disagree about what congressional neologisms like “advanced telecommunications capability” mean. The FCC must interpret and apply the indeterminate language of Section 706 of the Telecommunications Act, which requires the FCC to determine “whether advanced telecommunications capability is being deployed in a reasonable and timely fashion.” If the answer is negative, the agency must “take immediate action to accelerate deployment of such capability by removing barriers to infrastructure investment and by promoting competition in the telecommunications market.” The inquiry is reported in an annual “Broadband Progress Report.” Much of the “scandal” of this proceeding is confusion about what “broadband” means.

What is broadband?

First: what qualifies as “broadband” download speed? It depends.

The OECD says anything above 256 kbps.

ITU standards set it at above 1.5 Mbps (or is it 2.0 Mbps?).

In the US, broadband is generally defined as a higher speed. The USDA’s Rural Utilities Service defines it as 4.0 Mbps.

The FCC’s 2015 Broadband Progress Report found, as Obama FCC officials put it, that “the FCC’s definition of broadband” is now 25 Mbps. This is why advocates insist “broadband access” includes only wireline services above 25 Mbps.

But in the same month, the Obama FCC determined in the Open Internet Order that anything above dialup speed–56 kbps–is “broadband Internet access service.”

So, according to regulation advocates, 1.5 Mbps DSL service isn’t “broadband access” service but it is “broadband Internet access service.” Likewise a 30 Mbps 4G LTE connection isn’t a “broadband access” service but it is “broadband Internet access service.”

In other words, the word games about “broadband” are not coming from the Trump FCC. There is no consistency in what “broadband” means because prior FCCs kept changing the definition and even used the term differently in different proceedings. As the Obama FCC said in 2009, “In previous reports to Congress, the Commission used the terms ‘broadband,’ ‘advanced telecommunications capability,’ and ‘advanced services’ interchangeably.”
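
To see the inconsistency concretely, here is a minimal sketch (illustrative only; the thresholds are the ones cited above, and the function name is mine) that labels a single connection under each regime:

    # Illustrative sketch: how one connection speed is labeled under each
    # "broadband" definition cited above. Thresholds in Mbps; "above X"
    # definitions are treated as strictly greater than X.
    THRESHOLDS_MBPS = {
        "OECD": 0.256,
        "ITU": 1.5,
        "USDA Rural Utilities Service": 4.0,
        "FCC 2015 Broadband Progress Report": 25.0,
        "FCC 2015 Open Internet Order (BIAS)": 0.056,
    }

    def broadband_labels(speed_mbps):
        """Return, per definition, whether the connection counts as broadband."""
        return {body: speed_mbps > cutoff for body, cutoff in THRESHOLDS_MBPS.items()}

    # A 1.5 Mbps DSL line is "broadband" to the OECD and under the Open
    # Internet Order, but not under the 2015 Broadband Progress Report.
    for body, is_broadband in broadband_labels(1.5).items():
        print(f"{body}: {'broadband' if is_broadband else 'not broadband'}")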

Instead, what is going on is that the Trump FCC is trying to apply Section 706 to the current broadband market. The main questions are, what is advanced telecommunications capability, and is it “being deployed in a reasonable and timely fashion”?

Is mobile broadband an “advanced telecommunications capability”?

Previous FCCs declined to adopt a speed benchmark for when wireless service satisfies the “advanced telecommunications capability” definition. The so-called controversy is because the latest NOI revisits this omission in light of consumer trends. The NOI straightforwardly asks whether mobile broadband above 10 Mbps satisfies the statutory definition of “advanced telecommunications capability.”

For that, the FCC must consult the statute. Such a capability, the statute says, is technology-neutral (i.e. includes wireless and “fixed” connections) and “enables users to originate and receive high-quality voice, data, graphics, and video telecommunications.”

Historically, since the statute doesn’t provide much precision, the FCC has examined subscription rates for various broadband speeds and services. From 2010 to 2015, the Obama FCCs defined advanced telecommunications capability as a fixed connection of 4 Mbps. In 2015, as mentioned, that benchmark was raised to 25 Mbps.

Regulation advocates fear that if the FCC looks at subscription rates, the agency might find that mobile broadband above 10 Mbps is an advanced telecommunications capability. This finding, they feel, would undermine the argument that the US broadband market needs intense regulation. According to recent Pew surveys, 12% of adults–about 28 million people–are “wireless only” and don’t have a wireline subscription. Those numbers certainly raise the possibility that mobile broadband is an advanced telecommunications capability.

Let’s look at the three fixed broadband technologies that “pass” the vast majority of households–cable modem, DSL, and satellite–and narrow the data to connections 10 Mbps or above.*

Home broadband connections (10 Mbps+)
Cable modem – 54.4 million
DSL – 11.8 million
Satellite – 1.4 million

It’s hard to know for sure since Pew measures adult individuals and the FCC measures households, but it’s possible more people have 4G LTE as home broadband (about 28 million adults and their families) than have 10 Mbps+ DSL as home broadband (11.8 million households).
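
A back-of-envelope sketch of that comparison (the adult-population figure is my assumption; the other numbers come from the text above, and adults and households are different units, so this is only suggestive):

    # Rough comparison of Pew's mobile-only estimate with the FCC's count of
    # 10 Mbps+ DSL households. Units differ (adults vs. households).
    US_ADULTS = 235_000_000            # assumed adult population (~2017)
    PEW_MOBILE_ONLY_SHARE = 0.12       # Pew: 12% of adults are "wireless only"

    mobile_only_adults = US_ADULTS * PEW_MOBILE_ONLY_SHARE   # ~28 million
    dsl_10mbps_households = 11_800_000                        # FCC figure above

    print(f"Mobile-only adults:      {mobile_only_adults / 1e6:.0f} million")
    print(f"10 Mbps+ DSL households: {dsl_10mbps_households / 1e6:.1f} million")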

Subscription rates aren’t the end of the inquiry, but the fact that millions of households are going mobile-only rather than DSL or cable modem is suggestive evidence that mobile broadband offers an advanced telecommunications capability. (Considering T-Mobile is now providing 50 GB of data per line per month, mobile-only household growth will likely accelerate.)

Are high-speed services “being deployed in a reasonable and timely fashion”?

The second inquiry is whether these advanced telecommunications capabilities “are being deployed in a reasonable and timely fashion.” Again, the statute doesn’t give much guidance but consumer adoption of high-speed wireline and wireless broadband has been impressive.

So few people had 25 Mbps for so long that the FCC didn’t record it in its Internet Access Services reports until 2011. At the end of 2011, 6.3 million households subscribed to 25 Mbps. Less than five years later, in June 2016, over 56 million households subscribed. In the last year alone, fixed providers extended 25 Mbps or greater speeds to 21 million households.
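
As a rough check on “reasonable and timely,” the growth rate implied by those figures (a minimal sketch; the 4.5-year span from end-2011 to June 2016 is my approximation):

    # Implied compound annual growth of 25 Mbps+ household subscriptions,
    # using the figures cited above.
    subs_end_2011 = 6.3e6    # households with 25 Mbps, end of 2011
    subs_mid_2016 = 56e6     # households with 25 Mbps, June 2016
    years = 4.5              # end-2011 to June 2016 (approximate)

    cagr = (subs_mid_2016 / subs_end_2011) ** (1 / years) - 1
    print(f"Implied annual growth: {cagr:.0%}")   # roughly 60% per year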

The FCC is not completely without guidance on this question. As part of the 2008 Broadband Data Services Improvement Act, Congress instructed the FCC to use international comparisons in its Section 706 Report. International comparisons also suggest that the US is deploying advanced telecommunications capability in a timely manner. For instance, according to the OECD the US has 23.4 fiber and cable modem connections per 100 inhabitants, which far exceeds the OECD average, 16.2 per 100 inhabitants.**

Anyway, the sky is not falling because the FCC is asking about mobile broadband subscription rates. More can be done to accelerate broadband–particularly if the government frees up more spectrum and local governments improve their permitting processes–but the Section 706 inquiry offers little that is controversial or new.

 

*Fiber and fixed wireless connections, 9.6 million and 0.3 million subscribers, respectively, are also noteworthy but these 10 Mbps+ technologies only cover certain areas of the country.

**America’s high rank in the OECD is similar if DSL is included, but the quality of DSL varies widely and often doesn’t provide 10 Mbps or 25 Mbps speeds.

By Brent Skorup and Melody Calkins

Recently, the FCC sought comments for its Media Modernization Initiative in its effort to “eliminate or modify [media] regulations that are outdated, unnecessary, or unduly burdensome.” The regulatory thicket for TV distribution has long encumbered broadcast and cable providers. These rules encourage large, homogeneous cable TV bundles and burden cable and satellite operators with high compliance costs. (See the complex web of TV regulations at the Media Metrics website.)

One reason “skinny bundles” from online video providers and cable operators are attracting consumers is that online video circumvents the FCC’s Rube Goldberg-like system altogether. The FCC should end its 50-year experiment with TV regulation, which, among other things, has raised the cost of TV and degraded the First Amendment rights of media outlets.

The proposal to eliminate legacy media rules garnered a considerable amount of support from a wide range of commenters. In our filed reply comments, we identify four regulatory rules ripe for removal:

  • News distortion. This uncodified, under-the-radar rule allows the commission to revoke a broadcaster’s license if the FCC finds that the broadcaster deliberately engages in “news distortion, staging, or slanting.” The rule traces back to the FCC’s longstanding position that it can revoke licenses from broadcast stations if programming is not “in the public interest.”

    Though uncodified and not strictly enforced, the rule was reiterated in the FCC’s 2008 broadcast guidelines. The outline of the rule was laid out in the 1998 case Serafyn v. CBS, involving a complaint by a Ukrainian-American who alleged that the “60 Minutes” news program had unfairly edited interviews to portray Ukrainians as backwards and anti-Semitic. The FCC dismissed the complaint but the DC Circuit reversed that dismissal and required FCC intervention. (CBS settled and the complaint was dropped before the FCC could intervene.)

    “Slanted” and distorted news can be found in (unregulated) cable news, newspapers, Twitter, and YouTube. The news distortion rule should be repealed and broadcasters should have regulatory parity (and their full First Amendment rights) restored.
  • Must-carry. The rule requires cable operators to distribute the programming of local broadcast stations at broadcasters’ request. (Stations carrying relatively low-value broadcast networks seek carriage via must-carry. Stations carrying popular networks like CBS and NBC can negotiate payment from cable operators via “retransmission consent” agreements.) Must-carry was narrowly sustained by the Supreme Court in 1994 against a First Amendment challenge, on the grounds that cable operators had monopoly power in the pay-TV market. Since then, however, cable’s market share has shrunk from 95% to 53%. Broadcast stations have far more options for distribution, including satellite TV, telco TV, and online distribution, and it’s unlikely the rules would survive a First Amendment challenge today.
  • Network nonduplication and syndicated exclusivity. These rules limit how and when broadcast programming can be distributed and allow the FCC to intervene if a cable operator breaches a contract with a broadcast station. But the (exempted) distribution of hundreds of non-broadcast channels (e.g., CNN, MTV, ESPN) shows that programmers and distributors are fully capable of negotiating private agreements without FCC oversight. These rules simply make licensing negotiations more difficult and invite FCC intervention.

Finally, we identify retransmission consent regulations and compulsory licenses for repeal. Because “retrans” interacts with copyright matters outside of the FCC’s jurisdiction, we encourage the FCC to work with the Copyright Office in advising Congress to repeal these statutes. Cable operators dislike the retrans framework and broadcasters dislike being compelled to license programming at regulated rates. These interventions simply aren’t needed (hundreds of cable and online-only TV channels operate outside of this framework), and neither the FCC nor the Copyright Office particularly likes refereeing these fights. The FCC should break the stalemate and approach the Copyright Office about advocating for direct licensing of broadcast TV content.

It’s becoming clearer why, for six years out of eight, Obama’s appointed FCC chairmen resisted regulating the Internet with Title II of the 1934 Communications Act. Chairman Wheeler famously did not want to go that legal route. It was only after President Obama and the White House called on the FCC in late 2014 to use Title II that Chairman Wheeler relented. If anything, the hastily drafted 2015 Open Internet rules give ISPs a new incentive to curate the Internet in ways they didn’t want to before.

The 2016 court decision upholding the rules was a Pyrrhic victory for the net neutrality movement. In short, the decision revealed that the 2015 Open Internet Order provides no meaningful net neutrality protections–it allows ISPs to block and throttle content. As the judges who upheld the Order said, “The Order…specifies that an ISP remains ‘free to offer ‘edited’ services’ without becoming subject to the rule’s requirements.” 

The 2014 White House pressure didn’t occur in a vacuum. It occurred immediately after Democratic losses in the November 2014 midterms. As Public Knowledge president Gene Kimmelman tells it, President Obama needed to give progressives “a clean victory for us to show that we are standing up for our principles.” The slapdash legal finessing that followed was presaged by President Obama’s November 2014 national address urging Title II classification of the Internet, which cites the wrong communications law on the Obama White House website to this day.

The FCC staff did their best with what they were given, but the resulting Order was aimed at political symbolism and at acquiring jurisdiction to regulate the Internet, not at meaningful “net neutrality” protections. As internal FCC emails produced in a Senate majority report show, Wheeler’s reversal that week caught the non-partisan career FCC staff off guard. Literally overnight, FCC staff had to scrap the “hybrid” (non-Title II) order they’d been carefully drafting for weeks and scrape together a legal justification for using Title II. This meant calling in advocates to enhance the record and making dubious citations to the economics literature. Former FCC chief economist Prof. Michael Katz, whose work was cited in the Order, later told Forbes that he suspected the “FCC cited my papers as an inside joke, because they know how much I think net neutrality is a bad idea.”

Applying 1934 telegraph and telephone laws to the Internet was always going to have unintended consequences, but the politically driven Order increasingly looks like an own goal, even to supporters. Former FCC chief technologist Jon Peha, who supports Title II classification of ISPs, almost immediately raised the alarm that the Order offered “massive loopholes” to ISPs that could make the rules irrelevant. This became clear when the FCC attorney defending the Order in court acknowledged that ISPs are free to block and filter content and escape the Open Internet regulations and Title II. These concessions from the FCC surprised even AT&T VP Hank Hultquist:

Wow. ISPs are not only free to engage in content-based blocking, they can even create the long-dreaded fast and slow lanes so long as they make their intentions sufficiently clear to customers.

So the Open Internet Order not only permits the net neutrality “nightmare scenario,” it gives ISPs an incentive to curate the Internet. Despite the activist PR surrounding the Order, so-called “fast lanes”–like carrier-provided VoIP, VoLTE, and IPTV–have existed for years, and the FCC rules allow them. The Order permits ISP blocking, throttling, and “fast lanes”–what remains of “net neutrality”?

Prof. Susan Crawford presciently warned in 2005: 

I have lost faith in our ability to write about code in words, and I’m confident that any attempt at writing down network neutrality will be so qualified, gutted, eviscerated, and emptied that it will end up being worse than useless.

Aside from some religious ISPs, ISPs don’t want to filter Internet content. But the Obama FCC, via the “net neutrality” rules, gave them a new incentive: the Order deregulates ISPs that filter. ISPs will fight the rules because they want to continue offering their conventional Internet service without submitting to the Title II baggage. This is why ISPs favor scrapping the Order: not only is it the FCC’s first claim of authority to regulate Internet access, but if the rules are not repealed, ISPs will be compelled to make difficult decisions about their business models and technologies in the future.