Articles by Brent Skorup

Brent Skorup is a senior research fellow with the Technology Policy Program at the Mercatus Center at GMU. He has an economics degree from Wheaton College and a law degree from George Mason University. Opinions are his own.


In 2015, after White House pressure, the FCC took the radical step of classifying “broadband Internet access service” as a heavily regulated Title II service. Title II was created for the AT&T long-distance monopoly and the telegraph network; “promoting innovation and competition” is not its purpose. It’s ill-suited for the modern Internet, where hundreds of ISPs and tech companies are experimenting with new technologies and topologies.

Commissioner Brendan Carr was gracious enough to speak with Chris Koopman and me in a Mercatus podcast last week about his decision to vote to reverse the Title II classification. The podcast can be found at the Mercatus website. One highlight from Commissioner Carr:

Congress had a fork in the road. …In 1996, Congress made a decision that we’re going to head down the Title I route [for the Internet]. That decision has been one of the greatest public policy decisions that we’ve ever seen. That’s what led to the massive investment in the Internet. Over a trillion dollars invested. Consumers were protected. Innovators were free to innovate. Unfortunately, two years ago the Commission departed from that framework and moved into a very different heavy-handed regulatory world, the Title II approach.

Along those lines, in my recent ex parte meeting with Chairman Pai’s office, I pointed to an interesting 2002 study in the Review of Economics and Statistics from MIT Press about the stifling effects of Title II regulation:

[E]xisting economics scholarship suggests that a permissioned approach to new services, like that proposed in the [2015] Open Internet Order, inhibits innovation and new services in telecommunications. As a result of an FCC decision and a subsequent court decision in the late 1990s, for 18 to 30 months, depending on the firm, [Title II] carriers were deregulated and did not have to submit new offerings to the FCC for review. After the court decision, the FCC required carriers to file retroactive plans for services introduced after deregulation.

This turn of events allowed economist James Prieger to analyze and compare the rate of new service deployment in the regulated period and the brief deregulated period. Prieger found that “some otherwise profitable services are not financially viable under” the permissioned regime. Critically, the number of services carriers deployed “during the [deregulated] interim is 60%-99% larger than the model predicts they would have created” when preapproval was required. Finally, Prieger found that firms would have introduced 62% more services during the entire study period if there had been no permissioned regime. This is suggestive evidence that the Order’s “Mother, May I?” approach will significantly harm the Internet services market.

Thankfully, this FCC has incorporated economic scholarship into its Restoring Internet Freedom Order and will undo the costly Title II classification for Internet services.

Broadcast license renewal challenges have troubled libertarians and free speech advocates for decades. Despite our efforts (and our law journal articles on the abuse of the licensing process), license challenges are legal. In fact, political parties, prior FCCs, and activist groups have encouraged license challenges based on TV content to ensure broadcasters are operating in “the public interest.” Further, courts have compelled and will compel a reluctant FCC to investigate “news distortion” and other violations of FCC broadcast rules. It’s a troubling state of affairs that has been pushed back into relevancy because FCC license challenges are in the news.

In recent years the FCC, whether led by Democrats or Republicans, has preferred to avoid tricky questions surrounding license renewals. Chairman Pai, like most recent FCC chairs, has been an outspoken defender of First Amendment protections and norms. He opposed, for instance, the Obama FCC’s attempt to survey broadcast newsrooms about their coverage. He also penned an op-ed bringing attention to the fact that federal NSF funding was being used by left-leaning researchers to monitor and combat “misinformation and propaganda” on social media.

The Republican commissioners’ silence today about license renewals is likely primarily because they have higher priorities (like broadband deployment and freeing up spectrum) than intervening in the competitive media marketplace. A second, less understood reason is that whether to investigate a news station isn’t really up to them. Courts can overrule them and compel an investigation.

Political actors have used FCC licensing procedures for decades to silence political opponents and unfavorable media. For reasons I won’t explore here, TV and radio broadcasters have diminished First Amendment rights and the public is permitted to challenge their licenses at renewal time.

So even in recent years, progressive “citizens groups” have challenged broadcasters’ license renewals over “one-sided programming.” Unfortunately, it works. For instance, in 2004 the prospect of multi-year renewal challenges from outside groups and the risk of payback from a Democratic FCC forced broadcast stations to trim a documentary critical of John Kerry from 40 minutes to 4 minutes. And, unlike their cable counterparts, broadcasters censor nude scenes in TV and movies because even a Janet Jackson Super Bowl scenario can lead to expensive license challenges.

These troubling licensing procedures and pressure points were largely unknown to most people, but, on October 11, President Trump tweeted:

“With all of the Fake News coming out of NBC and the Networks, at what point is it appropriate to challenge their License? Bad for country!”

So why hasn’t the FCC said it won’t investigate NBC and other broadcast station owners? It may be because courts can compel the FCC to investigate “news distortion.”

This is exactly what happened to the Clinton FCC. As Melody Calkins and I wrote in August about the FCC’s news distortion rule:

Though uncodified and not strictly enforced, the rule was reiterated in the FCC’s 2008 broadcast guidelines. The outline of the rule was laid out in the 1998 case Serafyn v. CBS, involving a complaint by a Ukrainian-American who alleged that the “60 Minutes” news program had unfairly edited interviews to portray Ukrainians as backwards and anti-Semitic. The FCC dismissed the complaint but the DC Circuit reversed that dismissal and required FCC intervention. (CBS settled and the complaint was dropped before the FCC could intervene.)

The commissioners might personally wish broadcasters had full First Amendment protections and want to dismiss all challenges, but current law permits and encourages license challenges. The commission can be compelled to act because of the sins of omission of prior FCCs: deciding to retain the news distortion rule and other antiquated “public interest” regulations for broadcasters. The existence of these old media rules means the FCC’s hands are tied.

Internet regulation advocates are trying to turn a recent FCC Notice of Inquiry about the state of US telecommunications services into a controversy. Twelve US Senators have accused the FCC of wanting to “redefin[e] broadband” in order to “abandon further efforts to connect Americans.”

Given that Chairman Pai and the Commission are already taking steps to accelerate broadband deployment, with new proceedings and the formation of the Broadband Deployment Advisory Committee, the allegation that the current NOI is an excuse for inaction is perplexing.

The true “controversy” is much more mundane–reasonable people disagree about what congressional neologisms like “advanced telecommunications capability” mean. The FCC must interpret and apply the indeterminate language of Section 706 of the Telecommunications Act, which requires the FCC to determine “whether advanced telecommunications capability is being deployed in a reasonable and timely fashion.” If the answer is negative, the agency must “take immediate action to accelerate deployment of such capability by removing barriers to infrastructure investment and by promoting competition in the telecommunications market.” The inquiry is reported in an annual “Broadband Progress Report.” Much of the “scandal” of this proceeding is confusion about what “broadband” means.

What is broadband?

First: what qualifies as “broadband” download speed? It depends.

The OECD says anything above 256 kbps.

ITU standards set it at above 1.5 Mbps (or is it 2.0 Mbps?).

In the US, broadband is generally defined as a higher speed. The USDA’s Rural Utilities Service defines it as 4.0 Mbps.

The FCC’s 2015 Broadband Progress Report found, as Obama FCC officials put it, that “the FCC’s definition of broadband” is now 25 Mbps. This is why advocates insist “broadband access” includes only wireline services above 25 Mbps.

But in the same month, the Obama FCC determined in the Open Internet Order that anything above dialup speed–56 kbps–is “broadband Internet access service.”

So, according to regulation advocates, 1.5 Mbps DSL service isn’t “broadband access” service but it is “broadband Internet access service.” Likewise a 30 Mbps 4G LTE connection isn’t a “broadband access” service but it is “broadband Internet access service.”
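To see how these competing thresholds interact, here is a minimal sketch in Python (my own illustration, using only the figures cited above; the function and labels are mine, not the FCC’s) that checks a single download speed against each definition:

```python
# Minimal sketch (my illustration) comparing one download speed against the
# "broadband" thresholds cited above. All values are in Mbps.
DEFINITIONS_MBPS = {
    "OECD (above 256 kbps)": 0.256,
    "ITU (above 1.5 Mbps)": 1.5,
    "USDA Rural Utilities Service (4 Mbps)": 4.0,
    "FCC 2015 Broadband Progress Report (25 Mbps)": 25.0,
    "FCC 2015 Open Internet Order 'BIAS' (above dialup, 56 kbps)": 0.056,
}

def classify(speed_mbps: float) -> dict:
    """Report which of the cited definitions a download speed clears."""
    return {label: speed_mbps > threshold for label, threshold in DEFINITIONS_MBPS.items()}

# A 1.5 Mbps DSL line is "broadband Internet access service" under the Open
# Internet Order but falls short of the 25 Mbps Progress Report benchmark:
print(classify(1.5))
# A 30 Mbps 4G LTE connection clears every speed threshold, yet advocates still
# exclude it from "broadband access" because it is not a wireline service:
print(classify(30.0))
```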

In other words, the word games about “broadband” are not coming from the Trump FCC. There is no consistent meaning of “broadband” because prior FCCs kept changing the definition and even used the term differently in different proceedings. As the Obama FCC said in 2009, “In previous reports to Congress, the Commission used the terms ‘broadband,’ ‘advanced telecommunications capability,’ and ‘advanced services’ interchangeably.”

Instead, what is going on is that the Trump FCC is trying to apply Section 706 to the current broadband market. The main questions are, what is advanced telecommunications capability, and is it “being deployed in a reasonable and timely fashion”?

Is mobile broadband an “advanced telecommunications capability”?

Previous FCCs declined to adopt a speed benchmark for when wireless service satisfies the “advanced telecommunications capability” definition. The so-called controversy is because the latest NOI revisits this omission in light of consumer trends. The NOI straightforwardly asks whether mobile broadband above 10 Mbps satisfies the statutory definition of “advanced telecommunications capability.”

For that, the FCC must consult the statute. Such a capability, the statute says, is technology-neutral (i.e. includes wireless and “fixed” connections) and “enables users to originate and receive high-quality voice, data, graphics, and video telecommunications.”

Historically, since the statute doesn’t provide much precision, the FCC has examined subscription rates for various broadband speeds and services. From 2010 to 2015, the Obama FCCs defined advanced telecommunications capability as a fixed connection of 4 Mbps. In 2015, as mentioned, that benchmark was raised to 25 Mbps.

Regulation advocates fear that if the FCC looks at subscription rates, the agency might find that mobile broadband above 10 Mbps is an advanced telecommunications capability. This finding, they feel, would undermine the argument that the US broadband market needs intense regulation. According to recent Pew surveys, 12% of adults–about 28 million people–are “wireless only” and don’t have a wireline subscription. Those numbers certainly raise the possibility that mobile broadband is an advanced telecommunications capability.

Let’s look at the three fixed broadband technologies that “pass” the vast majority of households–cable modem, DSL, and satellite–and narrow the data to connections 10 Mbps or above.*

Home broadband connections (10 Mbps+)
Cable modem – 54.4 million
DSL – 11.8 million
Satellite – 1.4 million

It’s hard to know for sure since Pew measures adult individuals and the FCC measures households, but it’s possible more people have 4G LTE as home broadband (about 28 million adults and their families) than have 10 Mbps+ DSL as home broadband (11.8 million households).
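For what it’s worth, the back-of-the-envelope comparison looks like this (a sketch using only the figures above; note again that Pew counts adult individuals while the FCC counts households):

```python
# Back-of-the-envelope comparison using the figures cited above (my sketch).
# The units differ: Pew counts adult individuals, the FCC counts households.
wireless_only_adults = 28_000_000    # Pew: ~12% of adults are "wireless only"
dsl_10mbps_households = 11_800_000   # FCC: DSL connections at 10 Mbps or faster
print(wireless_only_adults > dsl_10mbps_households)  # True, with the caveat above
```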

Subscription rates aren’t the end of the inquiry, but the fact that millions of households are going mobile-only rather than DSL or cable modem is suggestive evidence that mobile broadband offers an advanced telecommunications capability. (Considering T-Mobile is now providing 50 GB of data per line per month, mobile-only household growth will likely accelerate.)

Are high-speed services “being deployed in a reasonable and timely fashion”?

The second inquiry is whether these advanced telecommunications capabilities “are being deployed in a reasonable and timely fashion.” Again, the statute doesn’t give much guidance but consumer adoption of high-speed wireline and wireless broadband has been impressive.

So few people had 25 Mbps for so long that the FCC didn’t record it in its Internet Access Services reports until 2011. At the end of 2011, 6.3 million households subscribed to 25 Mbps. Less than five years later, in June 2016, over 56 million households subscribed. In the last year alone, fixed providers extended 25 Mbps or greater speeds to 21 million households.

The FCC is not completely without guidance on this question. As part of the 2008 Broadband Data Services Improvement Act, Congress instructed the FCC to use international comparisons in its Section 706 Report. International comparisons also suggest that the US is deploying advanced telecommunications capability in a timely manner. For instance, according to the OECD the US has 23.4 fiber and cable modem connections per 100 inhabitants, which far exceeds the OECD average, 16.2 per 100 inhabitants.**

Anyway, the sky is not falling because the FCC is asking about mobile broadband subscription rates. More can be done to accelerate broadband–particularly if the government frees up more spectrum and local governments improve their permitting processes–but the Section 706 inquiry offers little that is controversial or new.

 

*Fiber and fixed wireless connections, 9.6 million and 0.3 million subscribers, respectively, are also noteworthy but these 10 Mbps+ technologies only cover certain areas of the country.

**America’s high rank in the OECD is similar if DSL is included, but the quality of DSL varies widely and often doesn’t provide 10 Mbps or 25 Mbps speeds.

By Brent Skorup and Melody Calkins

Recently, the FCC sought comments for its Media Modernization Initiative in its effort to “eliminate or modify [media] regulations that are outdated, unnecessary, or unduly burdensome.” The regulatory thicket for TV distribution has long encumbered broadcast and cable providers. These rules encourage large, homogeneous cable TV bundles and burden cable and satellite operators with high compliance costs. (See the complex web of TV regulations at the Media Metrics website.)

One reason “skinny bundles” from online video providers and cable operators are attracting consumers is that online video circumvents the FCC’s Rube Goldberg-like system altogether. The FCC should end its 50-year experiment with TV regulation, which, among other things, has raised the cost of TV and degraded the First Amendment rights of media outlets.

The proposal to eliminate legacy media rules garnered a considerable amount of support from a wide range of commenters. In our filed reply comments, we identify four regulatory rules ripe for removal:

  • News distortion. This uncodified, under-the-radar rule allows the commission to revoke a broadcaster’s license if the FCC finds that the broadcaster deliberately engages in “news distortion, staging, or slanting.” The rule traces back to the FCC’s longstanding position that it can revoke licenses from broadcast stations if programming is not “in the public interest.”

    Though uncodified and not strictly enforced, the rule was reiterated in the FCC’s 2008 broadcast guidelines. The outline of the rule was laid out in the 1998 case Serafyn v. CBS, involving a complaint by a Ukrainian-American who alleged that the “60 Minutes” news program had unfairly edited interviews to portray Ukrainians as backwards and anti-Semitic. The FCC dismissed the complaint but the DC Circuit reversed that dismissal and required FCC intervention. (CBS settled and the complaint was dropped before the FCC could intervene.)

    “Slanted” and distorted news can be found in (unregulated) cable news, newspapers, Twitter, and YouTube. The news distortion rule should be repealed and broadcasters should have regulatory parity (and their full First Amendment rights) restored.
  • Must-carry. The rule requires cable operators to distribute the programming of local broadcast stations at broadcasters’ request. (Stations carrying relatively low-value broadcast networks seek carriage via must-carry. Stations carrying popular networks like CBS and NBC can negotiate payment from cable operators via “retransmission consent” agreements.) Must-carry was narrowly sustained by the Supreme Court in 1994 against a First Amendment challenge, on the grounds that cable operators had monopoly power in the pay-TV market. Since then, however, cable’s market share has shrunk from 95% to 53%. Broadcast stations have far more options for distribution, including satellite TV, telco TV, and online distribution, and it’s unlikely the rules would survive a First Amendment challenge today.
  • Network nonduplication and syndicated exclusivity. These rules limit how and when broadcast programming can be distributed and allow the FCC to intervene if a cable operator breaches a contract with a broadcast station. But the (exempted) distribution of hundreds of non-broadcast channels (e.g., CNN, MTV, ESPN) shows that programmers and distributors are fully capable of negotiating carriage privately, without FCC oversight. These rules simply make licensing negotiations more difficult and invite FCC intervention.

Finally, we identify retransmission consent regulations and compulsory licenses for repeal. Because “retrans” interacts with copyright matters outside of the FCC’s jurisdiction, we encourage the FCC to work with the Copyright Office in advising Congress to repeal these statutes. Cable operators dislike the retrans framework and broadcasters dislike being compelled to license programming at regulated rates. These interventions simply aren’t needed (hundreds of cable and online-only TV channels operate outside of this framework), and neither the FCC nor the Copyright Office particularly likes refereeing these fights. The FCC should break the stalemate and approach the Copyright Office about advocating for direct licensing of broadcast TV content.

It’s becoming clearer why, for six years out of eight, Obama’s appointed FCC chairmen resisted regulating the Internet with Title II of the 1934 Communications Act. Chairman Wheeler famously did not want to go that legal route. It was only after President Obama and the White House called on the FCC in late 2014 to use Title II that Chairman Wheeler relented. If anything, the hastily-drafted 2015 Open Internet rules provide a new incentive to ISPs to curate the Internet in ways they didn’t want to before. 

The 2016 court decision upholding the rules was a Pyrrhic victory for the net neutrality movement. In short, the decision revealed that the 2015 Open Internet Order provides no meaningful net neutrality protections–it allows ISPs to block and throttle content. As the judges who upheld the Order said, “The Order…specifies that an ISP remains ‘free to offer ‘edited’ services’ without becoming subject to the rule’s requirements.” 

The 2014 White House pressure didn’t occur in a vacuum. It occurred immediately after Democratic losses in the November 2014 midterms. As Public Knowledge president Gene Kimmelman tells it, President Obama needed to give progressives “a clean victory for us to show that we are standing up for our principles.” The slapdash legal finessing that followed was presaged by President Obama’s November 2014 national address urging Title II classification of the Internet, which cites the wrong communications law on the Obama White House website to this day.

The FCC staff did their best with what they were given, but the resulting Order was aimed at political symbolism and acquiring jurisdiction to regulate the Internet, not meaningful “net neutrality” protections. As internal FCC emails produced in a Senate majority report show, Wheeler’s reversal that week caught the nonpartisan career FCC staff off guard. Literally overnight, FCC staff had to scrap the “hybrid” (non-Title II) order they’d been carefully drafting for weeks and scrape together a legal justification for using Title II. That meant calling in advocates to enhance the record and making dubious citations to the economics literature. Former FCC chief economist Michael Katz, whose work was cited in the Order, later told Forbes that he suspected the “FCC cited my papers as an inside joke, because they know how much I think net neutrality is a bad idea.”

Applying 1934 telegraph and telephone laws to the Internet was always going to have unintended consequences, but the politically driven Order increasingly looks like an own goal, even to supporters. Former FCC chief technologist Jon Peha, who supports Title II classification of ISPs, almost immediately raised the alarm that the Order offered “massive loopholes” to ISPs that could make the rules irrelevant. This became clear when the FCC attorney defending the Order in court acknowledged that ISPs are free to block and filter content and thereby escape the Open Internet regulations and Title II. These concessions from the FCC surprised even AT&T VP Hank Hultquist:

Wow. ISPs are not only free to engage in content-based blocking, they can even create the long-dreaded fast and slow lanes so long as they make their intentions sufficiently clear to customers.

So the Open Internet Order not only permits the net neutrality “nightmare scenario,” it gives ISPs an incentive to curate the Internet. Despite the activist PR surrounding the Order, so-called “fast lanes”–like carrier-provided VoIP, VoLTE, and IPTV–have existed for years and the FCC rules allow them. The Order permits ISP blocking, throttling, and “fast lanes”–so what remains of “net neutrality”?

Prof. Susan Crawford presciently warned in 2005: 

I have lost faith in our ability to write about code in words, and I’m confident that any attempt at writing down network neutrality will be so qualified, gutted, eviscerated, and emptied that it will end up being worse than useless.

Aside from a few religious ISPs, ISPs generally don’t want to filter Internet content. But the Obama FCC, via the “net neutrality” rules, gives them a new incentive: the Order deregulates ISPs that filter. ISPs will fight the rules because they want to continue offering their conventional Internet service without submitting to the Title II baggage. This is why ISPs favor scrapping the Order: not only is it the FCC’s first claim of authority to regulate Internet access, but if the rules are not repealed, ISPs will be compelled to make difficult decisions about their business models and technologies in the future.

By Brent Skorup and Melody Calkins

Tech optimists predict that drones and small aircraft may soon crowd US skies. An FAA administrator predicted that by 2020 tens of thousands of drones would be in US airspace at any one time. Further, over a dozen companies, including Uber, are building vertical takeoff and landing (VTOL) aircraft that could one day shuttle people point-to-point in urban areas. Today, low-altitude airspace use is episodic (helicopters, ultralights, drones), and with such light use the airspace is shared on an ad hoc basis with little air traffic management. Coordinating thousands of aircraft in low-altitude flight, however, demands a new regulatory framework.

Why not auction off low-altitude airspace for exclusive use?

There are two basic paradigms for resource use: open access and exclusive ownership. Most high-altitude airspace is lightly used and the open access regime works tolerably well because there are a small number of players (airline operators and the government) and fixed routes. Similarly, Class G airspace—which varies by geography but is generally the airspace from the surface to 700 feet above ground—is uncontrolled and virtually open access.

Valuable resources vary immensely in their character–taxi medallions, real estate, radio spectrum, intellectual property, water–and a resource-use paradigm, once selected, requires iteration and modification to ensure productive use. “The trick,” Prof. Richard Epstein notes, “is to pick the right initial point to reduce the stress on making these further adjustments.” If indeed dozens of operators will be vying for variable drone and VTOL routes in hundreds of local markets, exclusive-use models could create more social benefits and output than open access and regulatory management. NASA is exploring complex coordination systems in this airspace, but rather than agency permissions, lawmakers should consider using property rights and the price mechanism.

The initial allocation of airspace could be determined by auction; a simplified sketch of such an auction follows the list below. An agency, probably the FAA, would:

  1. Identify and define geographic parcels of Class G airspace;
  2. Auction off the parcels to any party (private corporations, local governments, non-commercial stakeholders, or individual users) for a term of years with an expectation of renewal; and
  3. Permit the sale, combination, and subleasing of those parcels.
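A simplified sketch of step 2 might look like the following; every parcel name, bidder, and dollar figure is invented for illustration and implies nothing about an actual FAA process:

```python
# Hypothetical sketch: assign each Class G airspace parcel to its highest
# bidder for a renewable term of years. All names and amounts are invented.
bids = {
    "metro-downtown-200-400ft": {"DroneDeliveryCo": 1_200_000, "City Transit Authority": 900_000},
    "metro-suburb-200-400ft":   {"DroneDeliveryCo": 150_000, "Hobbyist Coalition": 200_000},
}

def assign_parcels(bids_by_parcel):
    """Award each parcel to its highest bidder; ties are ignored in this sketch."""
    return {
        parcel: max(parcel_bids.items(), key=lambda kv: kv[1])
        for parcel, parcel_bids in bids_by_parcel.items()
    }

for parcel, (winner, price) in assign_parcels(bids).items():
    print(f"{parcel}: awarded to {winner} for ${price:,}; resale and subleasing permitted")
```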

The likely alternative scenario–regulatory allocation and management of airspace–derives from historical precedent in aviation and spectrum policy:

  1. First movers and the politically powerful acquire de facto control of low-altitude airspace,
  2. Incumbents and regulators exclude and inhibit newcomers and innovators,
  3. The rent-seeking and resource waste becomes unendurable for lawmakers, and
  4. Market-based reforms are slowly and haphazardly introduced.

For instance, after demand for commercial flights took off in the 1960s, a command-and-control quota system was created for crowded Northeast airports. Takeoff and landing rights, called “slots,” were assigned to early airlines but regulators did not allow airlines to sell those rights. The anticompetitive concentration and hoarding of airport slots at terminals is still being slowly unraveled by Congress and the FAA to this day. There’s a similar story for government assignment of spectrum over decades, as explained in Thomas Hazlett’s excellent new book, The Political Spectrum.

The benefit of an auction, plus secondary markets, is that the resource is generally put to its highest-valued use. Secondary markets and subleasing also permit latecomers and innovators to gain resource access despite lacking an initial assignment and political power. Further, exclusive use rights would also provide VTOL operators (and passengers) the added assurance that routes would be “clear” of potential collisions. (A more regulatory regime might provide that assurance but likely via complex restrictions on airspace use.) Airspace rights would be a new cost for operators but exclusive use means operators can economize on complex sensors, other safety devices, and lobbying costs. Operators would also possess an asset to sublease and monetize.

Another bonus (from the government’s point of view) is that the sale of Class G airspace can provide government revenue. Revenue would be slight at first but could prove lucrative once there’s substantial commercial interest. The Federal government, for instance, auctions off its usage rights for grazing, oil and gas retrieval, radio spectrum, mineral extraction, and timber harvesting. Spectrum auctions alone have raised over $100 billion for the Treasury since they began in 1994.

Guest post from Joe Kane, R Street Institute

We seldom see a cadre of deceased Founding Fathers petition the Federal Communications Commission, but this past week was an exception. All the big hitters—from George Washington to Benjamin Franklin—filed comments in favor of a free internet. Abraham Lincoln also weighed in from beyond the grave, reprising his threat “to attack with the North” if the commission doesn’t free the internet.

These dead Sons of Liberty likely are pleased that the FCC’s proposed rules take steps to protect innovation and free the internet from excessive regulation. But it shouldn’t surprise us that politicians have strong opinions. What about some figures with a broader perspective?

Jesus weighed in with forceful, if sometimes incomprehensible, views that take both sides on the commission’s Notice of Proposed Rulemaking, which seeks comment on scaling back the FCC’s 2015 decision to subject internet service to the heavy hand of Title II of the Communications Act of 1934. Satan, on the other hand, was characteristically harsher, entreating the commissioners to “rot in Florida.”

Our magical friends across the pond also chimed in with some thoughts. Harry Potter, no doubt frustrated with the slow Wi-Fi at Hogwarts, seems strongly in favor of keeping Title II. His compatriot Hermione Granger, however, is more supportive of the current FCC’s efforts to move away from laws designed to regulate a now-defunct telephone monopoly, perhaps because she realizes the 2015 rules won’t do much to improve internet service. Dumbledore used his comments to give a favorable evaluation of both Title II and the casting of Jude Law to portray his younger self in an upcoming film.

A few superheroes also deigned to join the discourse. Wonder Woman, Batman and Superman joined a coalition letter which made up with brevity what it lacked in substance. The same can’t be said for the FCC’s notice itself, which contains dozens of pages of analysis and seeks comments on many substantive suggestions designed to reduce regulatory burdens on infrastructure investment and the next generation of real time, internet-based services. Another, more diverse, coalition letter was joined by Morgan Freeman, Pepe the Frog, a “Mr. Dank Memes” and the Marvel villain (and Norse trickster god) Loki. It contained a transcript of Jerry Seinfeld’s Bee Movie.

Speaking of villains, Josef Stalin made known his preference that no rules be changed. But Adolf Hitler attacked Stalin’s position like it was 1941.

Then there are those with advanced degrees. Doctor Bigfoot and Doctor Who filed separate comments in support of net neutrality.

In a debate too often characterized by shrill and misleading rhetoric, it’s heartening to see the FCC’s comment process is engaging such lofty figures to substantively inform the policymaking process. I mean, it sure would be a shame if taxpayer money supporting the mandatory review of the 1,500,000+ comments in this proceeding was wasted on fake responses.

This post originally appeared at the R Street blog.

There is reporting suggesting that the Trump FCC may move to eliminate the FCC’s complex Title II regulations for the Internet and restore the FTC’s ability to police anticompetitive and deceptive practices online. This is obviously welcome news. These reports also suggest that FCC Chairman Pai and the FTC will require ISPs to add open Internet principles to their terms of service: no unreasonable blocking or throttling of content and no paid priority. These principles have always been imprecise because federal law allows ISPs to block objectionable content if they wish (like pornography or violent websites) and because ISPs have a First Amendment right to curate their services.

Whatever the exact wording, there shouldn’t be a per se ban on paid priority. Whatever policy develops should limit anticompetitive paid priority, not all paid priority. Paid prioritization is simply a form of consideration payment, the economists’ term for upstream producers paying downstream retailers or distributors for special treatment. There is an economics literature on consideration payments, and they are an accepted business practice in many other industries. Further, consideration payments often benefit small providers and niche customers. Some small and large companies with interactive IP services might be willing to pay for end-to-end service reliability.

The Open Internet Order’s paid priority ban has always been shortsighted because it attempts to preserve the Internet as it existed circa 2002. It resembles the FCC’s unfounded insistence for decades that subscription TV (i.e., how the vast majority of Americans consume TV today) was against “the public interest.” Like the defunct subscription TV ban, the paid priority ban is an economics-free policy that will hinder new services.

Despite what late-night talk show hosts might say, “fast lanes” on the Internet are here and will continue. “Fast lanes” have always been permitted because, as Obama’s US CTO Aneesh Chopra noted, some emerging IP services need special treatment. Priority transmission was built into Internet protocols years ago and the OIO doesn’t ban data prioritization; it bans BIAS providers from charging “edge providers” a fee for priority.
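For readers curious what “priority transmission built into Internet protocols” refers to: the IP header has long carried a ToS/DSCP field that applications can set to request priority handling, though whether any network honors the marking is entirely up to each operator. A minimal sketch, assuming a Linux host (my illustration, not anything prescribed by the Order or the FCC):

```python
import socket

# The IPv4 header's DSCP field (successor to the original Type of Service byte)
# lets an application request priority handling for its packets.
EF_DSCP = 46              # "Expedited Forwarding" class, defined in RFC 3246
TOS_VALUE = EF_DSCP << 2  # DSCP occupies the upper six bits of the ToS byte

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, TOS_VALUE)

# Packets sent on this socket now carry the EF marking, e.g. for a
# latency-sensitive stream; networks along the path may honor or ignore it.
sock.sendto(b"example-payload", ("198.51.100.10", 5004))
```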

The notion that there’s a level playing field online needing preservation is a fantasy. Non-real-time services like Netflix streaming, YouTube, Facebook pages, and major websites can mostly be “cached” on servers scattered around the US. Major web companies have their own form of paid prioritization–they spend millions annually, including large payments to ISPs, on transit agreements, CDNs, and interconnection in order to avoid congested Internet links.

The problem with a blanket paid priority ban is that it biases the evolution of the Internet in favor of these cache-able services and against real-time or interactive services like teleconferencing, live TV, and gaming. Caching doesn’t work for these services because there’s nothing to cache beforehand. 

When would paid prioritization make sense? Most likely for a specialized service for dedicated users that requires end-to-end reliability.

I’ll use a plausible example to illustrate the benefits of consideration payments online–a telepresence service for deaf people. As Martin Geddes described, a decade ago the government in Wales developed such a service. The service architects discovered that a well-functioning service required quality characteristics not supplied by ISPs. ISPs and video chat apps like Skype optimize their networks, video codecs, and services for non-deaf people (i.e., most customers) and prioritize consistent audio quality over video quality. While that’s useful for most people, deaf people need essentially the opposite optimization because they need to perceive subtle hand and finger motions. The typical app that prioritizes audio, not video, doesn’t work for them.

But high-def real-time video quality requires upstream and downstream capacity reservation and end-to-end reliability. This is not cheap to provide. An ISP, in this illustration, has three options–charge the telepresence provider, charge deaf customers a premium, or spread the costs across all customers. The paid priority ban means ISPs can only charge customers for increased costs. This paid priority ban unnecessarily limits the potential for such services since there may be companies or nonprofits willing to subsidize such a service.

It’s a specialized example but illustrates the idiosyncratic technical requirements needed for many real-time services. In fact, real-time services are the next big challenge in the Internet’s evolution. As streaming media expert Dan Rayburn noted, “traditional one-way live streaming is being disrupted by the demand for interactive engagement.”  Large and small edge companies are increasingly looking for low-latency video solutions. Today, a typical “live” event is broadcast online to viewers with a 15- to 45-second delay. This latency limits or kills the potential for interactive online streaming services like online talk shows, pet cams, online auctions, videogaming, and online classrooms.

If the FTC takes back oversight of ISPs and the Internet it should, as with any industry, permit any business practice that complies with competition law and consumer protection law. The agency should disregard the unfounded belief that consideration payments online (“paid priority”) are always harmful.

This week Congress passed joint resolutions, which President Trump is expected to sign, rescinding the FCC’s online privacy regulations. Ignore the hyperbole. Lawmakers are simply attempting to maintain the state of Internet privacy law that has existed for more than 20 years.

Since the Internet was commercialized in the 1990s, the Federal Trade Commission has used its authority to prevent “unfair or deceptive acts or practices” to prevent privacy abuses by Web companies and ISPs. In 2015, that changed. The Obama FCC classified “broadband Internet access service” as a common carrier service, thereby blocking the FTC’s authority to determine which ISP privacy policies and practices are acceptable.

Privacy advocates failed to convince the Obama FTC that de-identified browsing history is “sensitive” data. (The FTC has treated SSNs, medical information, financial information, precise location, and the like as “sensitive” for years, and companies must handle these differently.) The FCC was the next best thing, and in 2016 advocates convinced the FCC to declare that browsing history is “sensitive” data–but only when ISPs have it.

This has contributed to a regulatory mess for consumers and tech companies. Technological convergence is here. Regulatory convergence is not.

Consider a plausible scenario. I start watching an NFL game via Twitter on my tablet on Starbucks’ wifi. I head home at halftime and watch the game from my cable TV provider, Comcast. Then I climb into bed and watch overtime on my smartphone via NFL Mobile from Verizon.

One TV program, three privacy regimes. FTC guidelines cover me at Starbucks. Privacy rules from Title VI of the Communications Act cover my TV viewing. The brand-new FCC broadband privacy rules cover my NFL Mobile viewing and late-night browsing.

Other absurdities result from the FCC’s decision to regulate Internet privacy. For instance, if you bought your child a mobile plan with web filtering, she’s protected by FTC privacy standards, while your mobile plan is governed by FCC rules. Google Fiber customers are covered by FTC policies when they use Google Search but FCC policies when they use Yelp.

This Swiss-cheese approach to classifying services means that regulatory obligations fall haphazardly across services and technologies. It’s confusing to consumers and to companies, who need to write privacy policies based on artificial FCC distinctions that consumers disregard.

The House and Senate bills rescind the FCC “notice and choice” rules, which is the first step to restoring FTC authority. (In the meantime, the FCC will implement FTC-like policies.) 

Considering that these notice and choice rules have not even gone into effect, the rehearsed outrage from advocates demands explanation: The theatrics this week are not really about congressional repeal of the (inoperative) privacy rules. Two years ago the FCC decided to regulate the Internet in order to shape Internet services and content. The leading advocates are outraged because FCC control of the Internet is slipping away. Hopefully Congress and the FCC will eliminate the rest of the Title II baggage this year.

US telecommunications laws are in need of updates. US law states that “the Internet and other interactive computer services” should be “unfettered by Federal or State regulation,” but regulators are increasingly imposing old laws and regulations onto new media and Internet services. Further, Federal Communications Commission actions often duplicate or displace general competition laws. Absent congressional action, old telecom laws will continue to delay and obstruct new services. A new Mercatus paper by Roslyn Layton and Joe Kane shows how governments can modernize telecom agencies and laws.

Legacy Laws

US telecom laws are codified in Title 47 of the US Code and enforced mostly by the FCC. That the first eight sections of US telecommunications law are devoted to the telegraph, the killer app of 1850, illustrates congressional inaction towards obsolete regulations.

In the last decade, therefore, several media, Internet, and telecom companies inadvertently stumbled into Communications Act quagmires. An Internet streaming company, for instance, was bankrupted for upending the TV status quo established by the FCC in the 1960s; FCC precedents mean broadcasters can be credibly threatened with license revocation for airing a documentary critical of a presidential candidate; and the thousands of Internet service providers across the US are subjected to laws designed to constrain the 1930s AT&T long-distance phone monopoly.

US telecom and tech laws, in other words, are a shining example of American “kludgeocracy”–a regime of prescriptive and dated laws whose complexity benefits special interests and harms innovators. These anti-consumer results led progressive Harvard professor Lawrence Lessig to conclude in 2008 that “it’s time to demolish the FCC.” While Lessig’s proposal goes too far, Congress should listen to the voices on the right and left urging them to sweep away the regulations of the past and rationalize telecom law for the 21st century.

Modern Telecom Policy in Denmark

An interesting new Mercatus working paper explains how Denmark took up that challenge. The paper, “Alternative Approaches to Broadband Policy: Lessons on Deregulation from Denmark,” is by Denmark-based scholar Roslyn Layton, who served on President Trump’s transition team for telecom policy, and Joe Kane, a master’s student in the GMU econ department.

The “Nordic model” is often caricatured by American conservatives (and progressives like Bernie Sanders) as socialist control of industry. But as AEI’s James Pethokoukis and others point out, it’s time both sides updated their 1970s talking points. “[W]hen it comes to regulatory efficiency and business freedom,” Tyler Cowen recently noted, “Denmark has a considerably higher [Heritage Foundation] score than does the U.S.”

Layton and Kane explore Denmark’s relatively free-market telecom policies. They explain how Denmark modernized its telecom laws over time as technology and competition evolved. Critically, the center-left government eliminated Denmark’s telecom regulator in 2011 in light of the “convergence” of services to the Internet. Scholars noted,

Nobody seemed to care much—except for the staff who needed to move to other authorities and a few people especially interested in IT and telecom regulation.

Even-handed, light telecom regulation performs pretty well. Denmark, along with South Korea, leads the world in terms of broadband access. The country also has a modest universal service program that depends primarily on the market. Further, similar to other Nordic countries, Denmark permitted a voluntary forum, including consumer groups, ISPs, and Google, to determine best practices and resolve “net neutrality” controversies.

Contrast Denmark’s tech-neutral, consumer-focused approach with recent proceedings in the United States. One of the Obama FCC’s major projects was attempting to regulate how TV streaming apps functioned–despite the fact that TV has never been more abundant and competitive. Countless hours of staff and industry time were wasted (Trump’s election killed the effort) because advocates saw an opportunity to regulate the streaming market with a law intended to help Circuit City (RIP) sell a few more devices in 1996. The biggest waste of government resources has been the “net neutrality” fight, which stems from prior FCC attempts to apply 1930s telecom laws to 1960s computer systems. Old rules haphazardly imposed on new technologies create a compliance mindset in our tech and telecom industries. Worse, these unwinnable fights over legal minutiae prevent FCC staff from working on issues where they can actually help consumers.

Americans deserve better telecom laws, but the inscrutability of FCC actions means consumers don’t know what to ask for. Layton and Kane show that alternative frameworks are available. They highlight Denmark’s political and cultural differences from the US. Nevertheless, Denmark’s telecom reforms and pro-consumer policies deserve study and emulation. The Danes have shown how tech-neutral, consumer-focused policies not only expand broadband access but also reduce government duplication and overreach.