Lawmakers frequently hear impressive-sounding stats about net neutrality like “83% of voters support keeping FCC’s net neutrality rules.” This 83% number (and similar “75% of Republicans support the rules”) is based on a survey from the Program for Public Consultation released in December 2017, right before the FCC voted to repeal the 2015 Internet regulations.

These numbers should be treated with skepticism. The survey generates its high approval numbers by asking about net neutrality “rules” found nowhere in the 2015 Open Internet Order. The released survey does not ask about the substance of the Order, like the Title II classification, government price controls online, or the FCC’s newly created authority to approve or disapprove new Internet services.

Here’s how the survey frames the issue:

Under the current regulations, ISPs are required to:

  • provide customers access to all websites on the internet.

  • provide equal access to all websites without giving any websites faster or slower download speeds.

The survey then essentially asks participants whether they favor these “regulations.” The nearly 400-page Order is long and complex, and this framing seriously misinterprets it; I suspect the survey creators lacked expertise in this area. These slogans are how net neutrality advocates discuss the issue, but the Obama FCC’s interpretations of the 2015 Order look nothing like these survey questions. Exaggeration and misinformation are common in net neutrality discussions, and unfortunately these pollsters contributed to both. (The Washington Post Fact Checker column recently assigned “Three Pinocchios” to similar claims by net neutrality advocates.)

Let’s break down these rules ostensibly found in the 2015 Order.

“ISPs are required to provide customers access to all websites on the internet”

This is wrong. The Obama FCC was quite clear in the 2015 Order and during litigation that ISPs are free to filter the Internet and block websites. From the oral arguments:

FCC lawyer: “If [ISPs] want to curate the Internet…that would drop them out of the definition of Broadband Internet Access Service.”
Judge Williams: “They have that option under the Order?”
FCC lawyer: “Absolutely, your Honor. …If they filter the Internet and don’t provide access to all or substantially all endpoints, then…the rules don’t apply to them.”

As a result, the judges who upheld the Order said, “The Order…specifies that an ISP remains ‘free to offer ‘edited’ services’ without becoming subject to the rule’s requirements.”

Further, in the 1996 Telecom Act, Congress gave Internet access providers legal protection in order to encourage them to block lewd and “objectionable content.” Today, many ISPs offer family-friendly Internet access that blocks, say, pornographic and violent content. An FCC Order cannot and did not rewrite the Telecom Act and cannot require “access to all websites on the internet.”

“ISPs are required to provide equal access to all websites without giving any websites faster or slower download speeds”

Again, wrong. There is no “equal access to all websites” mandate (see above). Further, the 2015 Order allows ISPs to prioritize certain Internet traffic because preventing prioritization online would break Internet services.

This myth–that net neutrality rules require ISPs to be dumb pipes, treating all bits the same–has been circulated for years but is derided by network experts. MIT computer scientist and early Internet developer David Clark colorfully dismissed this idea as “happy little bunny rabbit dreams.” He pointed out that prioritization has been built into Internet protocols for years and “[t]he network is not neutral and never has been.”

Other experts, such as tech entrepreneur and investor Mark Cuban and President Obama’s former chief technology officer Aneesh Chopra, have noted the need for Internet “fast lanes” as Internet services grow more diverse. Further, the nature of interconnection agreements and content delivery networks means that some websites pay for and receive better service than others.

This is not to say the Order is toothless. It authorizes government price controls and invents a vague “general conduct standard” that gives the agency broad authority to reject, favor, and restrict new Internet services. The survey, however, declined to ask members of the public about the substance of the 2015 rules and instead asked about support for net neutrality slogans that have only a tenuous relationship with the actual rules.

“Net neutrality” has always been about giving the FCC, the US media regulator, vast authority to regulate the Internet. In doing so, the 2015 Order rejects the 20-year policy of the United States, codified in law, that the Internet and Internet services should be “unfettered by Federal or State regulation.” The US tech and telecom sector thrived before 2015 and the 2017 repeal of the 2015 rules will reinstate, fortunately, that light-touch regulatory regime.

Mobile broadband is a tough business in the US. There are four national carriers–Verizon, AT&T, T-Mobile, and Sprint–but since about 2011, mergers have been contemplated (and attempted, but blocked). Recently, the competition has gotten fiercer. Higher data buckets and unlimited data plans have been great for consumers.

The FCC’s latest mobile competition report, citing UBS data, says that industry ARPU (basically, monthly revenue per subscriber), which had been fairly stable since 1998, declined significantly between 2013 and 2016, from about $46 to about $36. These revenue pressures seem to have fallen hardest on Sprint, which in February issued $1.5 billion of “junk bonds” to help fund its network investments. Analysts pointed out in 2016 that “Sprint has not reported full-year net profits since 2006.” Further, mobile TV watching is becoming a bigger business. AT&T and Verizon both plan to offer a TV bundle to their wireless customers this year, and T-Mobile’s purchase of Layer3 indicates an interest in offering a mobile TV service.
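For a rough sense of scale, that drop from about $46 to about $36 works out to roughly a 22% decline in monthly revenue per subscriber over three years, or just under 8% per year. Here is the back-of-the-envelope arithmetic, using the rounded figures above rather than the exact UBS data:

```python
# Back-of-the-envelope ARPU decline, using the rounded figures cited above
# (~$46 in 2013 to ~$36 in 2016); illustrative only, not the exact UBS data.
arpu_2013 = 46.0
arpu_2016 = 36.0

total_decline = (arpu_2013 - arpu_2016) / arpu_2013
annualized_decline = 1 - (arpu_2016 / arpu_2013) ** (1 / 3)  # over 3 years

print(f"Total decline: {total_decline:.1%}")            # ~21.7%
print(f"Annualized decline: {annualized_decline:.1%}")  # ~7.8% per year
```

However you slice it, that is serious pressure on carrier revenues in a capital-intensive industry.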

It’s these trends that probably pushed T-Mobile and Sprint to announce yesterday their intention to merge. All eyes will be on the DOJ and the FCC as their competition divisions consider whether to approve the merger.

The Core Arguments

Merger opponents’ primary argument is one that’s been raised several times since the aborted 2011 AT&T-T-Mobile merger: this “4 to 3” merger significantly raises the prospect of “tacit collusion.” After the merger, the story goes, the three remaining mobile carriers won’t work as hard to lower prices or improve services. Outright collusion on prices is illegal, but opponents have a point that tacit collusion is more difficult for regulators to prove, prevent, and prosecute.

The counterargument, that T-Mobile and Sprint are already making, is that “mobile” is not a distinct market anymore–technologies and services are converging. Therefore, tacit collusion won’t be feasible because mobile broadband is increasingly competing with landline broadband providers (like Comcast and Charter), and possibly even media companies (like Netflix and Disney). Further, they claim, T-Mobile and Sprint going it alone will each struggle to deploy a capex-intensive 5G network that can compete with AT&T, Verizon, Comcast-NBCU, and the rest, but the merged company will be a formidable competitor in TV and in consumer and enterprise broadband.

Competitive Review

Any prediction about whether the deal will be approved or denied is premature. This is a horizontal merger in a highly-visible industry and it will receive an intense antitrust review. (Rachel Barkow and Peter Huber have an informative 2001 law journal article about telecom mergers at the DOJ and FCC.) The DOJ and FCC will seek years of emails and financial records from Sprint and T-Mobile executives and attempt to ascertain the “real” motivation for the merger and its likely consumer effects.

T-Mobile and Sprint will likely lean on evidence that consumers view (or soon will view) mobile broadband and TV as a substitute for landline broadband and TV. Their story seems to be that, much as phone and TV went years ago from “local markets with one or two competitors” to “a national market with several competitors,” broadband is following a similar trajectory, and viewing this as a 4 to 3 merger misreads industry trends.

There’s preliminary evidence that mobile broadband will put competitive pressure on conventional, landline broadband. Census surveys indicate that in 2013, 10% of Internet-using households were mobile Internet only (no landline Internet). By 2015, about 20% of households were mobile-only, and the proportion of Internet users who had landline broadband actually fell from 82% to 75%. But this is still preliminary and I haven’t seen economic evidence yet that mobile is putting pricing pressure on landline TV and broadband.

FCC Review

Antitrust review is only one step, however. The FCC transaction review process is typically longer and harder to predict. The FCC has concurrent authority with the DOJ to review telecommunications mergers under Sections 7 and 11 of the Clayton Act, but it has never used that authority. Instead, the FCC uses its spectrum transfer review authority as a hook to evaluate mergers under the Communications Act’s (vague) “public interest” standard. Unlike antitrust standards, which generally put the burden on regulators to show consumer and competitive harm, the public interest standard as currently interpreted puts the burden on merging companies to show social and competitive benefits.

Hopefully the FCC will hew to a more rigorous antitrust inquiry and reform the open-ended public interest inquiry. As Chris Koopman and I wrote in a law journal article a few years ago, these FCC “public interest” reviews are sometimes excessively long, and advocates use the vague standards to force the FCC into ancillary concerns, like TV programming decisions and “net neutrality” compliance.

Part of the public interest inquiry is a complex “spectrum screen” analysis. Basically, transacting companies can’t hold too much “good” spectrum in a single regional market. I doubt the spectrum screen analysis would be dispositive (much of the past analysis seemed pretty ad hoc), but I do wonder whether it will come up, since it was a major issue in the attempted AT&T-T-Mobile merger.

In any case, that’s where I see the core issues, though we’ll learn much more as the merger reviews commence.

On March 19th, I had the chance to debate Franklin Foer at a Patrick Henry College event focused on the question, “Is Big Tech Big Brother?” It was billed as a debate over the role of technology in American society and whether government should be regulating media and technology platforms more generally. [The full event video is here.] Foer is the author of the new book, World Without Mind: The Existential Threat of Big Tech, in which he advocates a fairly expansive regulatory regime for modern information technology platforms. He is open to building on regulatory ideas from the past, including broadcast-esque licensing regimes, “Fairness Doctrine”-like mandates for digital intermediaries, “fiduciary” responsibilities, beefed-up antitrust intervention, and other types of controls. In a review of the book for Reason, and then again during the debate at Patrick Henry College, I offered some reflections on what history can teach us about how well ideas like those worked out in practice.

My closing statement of the debate, which lasted just a little over three minutes, offers a concise summation of what that history teaches us and why it would be so dangerous to repeat the mistakes of the past by wandering down that disastrous path again. That 3-minute clip is posted below. (The audience was polled before and after the event with the same question: “Do large tech companies wield too much power in our economy, media and personal lives and if so, should government(s) intervene?” At the beginning, the poll was roughly 70% Yes and 30% No, but after the debate ended it had reversed, with only 30% in favor of intervention and 70% against. Glad to turn around some minds on this one!)



Two weeks ago, as Facebook CEO Mark Zuckerberg was getting grilled by Congress during a two-day media circus set of hearings, I wrote a counterintuitive essay about how it could end up being Facebook’s greatest moment. How could that be? As I argued in the piece, with an avalanche of new rules looming, “Facebook is potentially poised to score its greatest victory ever as it begins the transition to regulated monopoly status, solidifying its market power, and limiting threats from new rivals.”

With the probable exception of Google, no firm other than Facebook has enough lawyers, lobbyists, and money to deal with the layers of red tape and corresponding regulatory compliance headaches that lie ahead. That’s true both here and especially abroad in Europe, which continues to pile on new privacy and “data protection” regulations. While such rules come wrapped in the very best of intentions, there’s just no getting around the fact that regulation has costs. In this case, the unintended consequence of well-intentioned data privacy rules is that the emerging regulatory regime will likely discourage (or potentially even destroy) the chances of getting the new types of innovation and competition that we so desperately need right now.

Others now appear to be coming around to this view. On April 23, both the New York Times and The Wall Street Journal ran feature articles with remarkably similar titles and themes. The New York Times article by Daisuke Wakabayashi and Adam Satariano was titled, “How Looming Privacy Regulations May Strengthen Facebook and Google,” and The Wall Street Journal’s piece, “Google and Facebook Likely to Benefit From Europe’s Privacy Crackdown,” was penned by Sam Schechner and Nick Kostov.

“In Europe and the United States, the conventional wisdom is that regulation is needed to force Silicon Valley’s digital giants to respect people’s online privacy. But new rules may instead serve to strengthen Facebook’s and Google’s hegemony and extend their lead on the internet,” note Wakabayashi and Satariano in the NYT essay. They continue on to note how “past attempts at privacy regulation have done little to mitigate the power of tech firms.” This includes regulations like Europe’s “right to be forgotten” requirement, which has essentially put Google in a privileged position as the “chief arbiter of what information is kept online in Europe.”

The recently enacted Stop Enabling Sex Trafficking Act (SESTA) has many problems, including that it doesn’t achieve its stated purpose of stopping sex trafficking. It contains a retroactivity clause that appears facially unconstitutional, but this provision would likely be severable by courts if used as the sole basis of a legal challenge. Perhaps more concerning are the law’s potential First Amendment violations.

These concerns go far beyond the rights of websites as speakers, extending to individual users’ content generation. Promoting sex trafficking is already a crime and a lawful restraint on speech. Websites, however, have acted broadly and quickly due to concerns about their new liability under the law, and as a result lawful speech has also been stifled.

Given the controversial nature of the law it seems likely that a legal challenge is forthcoming. Here are three ideas about what a First Amendment challenge to the law might look like.


On Monday, April 16th, the Technology Policy Institute hosted an event on “Facebook & Cambridge Analytica: Regulatory & Policy Implications.” I was invited to deliver some remarks on a panel that included Howard Beales of George Washington University, Stuart Ingis of Venable LLP, Josephine Wolff of the Rochester Institute of Technology, and Thomas Lenard of TPI, who moderated. I offered some thoughts about the potential trade-offs associated with treating Facebook like a regulated public utility. I wrote an essay here last week on that topic. My remarks at the event begin at the 13:45 mark of the video.

 

Expanding rural broadband has generated significant interest in recent years. However, the current subsidy programs are often mismanaged and impose little accountability. It’s not clear what effect rural broadband subsidies have had, despite the amount of money spent on them. As economist Scott Wallsten has pointed out, the US government has spent around $100 billion on rural telecommunications and broadband since 1995 “without evidence that it has improved adoption.”

So I was pleased to hear a few months ago that the Montana Public Service Commission was making an inquiry into how to improve rural broadband subsidy programs. Montana looms large in rural broadband discussions because Montana telecommunications providers face some of the most challenging terrain in the US–mountainous, vast, and lightly populated. (In fact, “no bars on your phone” in rural Montana is a major plot element in the popular videogame Far Cry 5. HT Rob Jackson.)

I submitted comments in the Montana PSC proceeding and received an invitation to testify at a hearing on the subject. So last week I flew to Helena to discuss rural broadband programs with the PSC and panelists. I emphasized three points.

  • Federal broadband subsidy programs are facing higher costs and fewer beneficiaries.

Using FCC data, I calculated that since 1998, USF high-cost subsidies to Montana telecom companies have risen by about 40% while the number of rural customers served by those companies has decreased by over 50%. I suspect these trends are common nationally, with USF subsidies increasing while fewer people benefit.
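Those two trends compound: when total subsidies grow while the customer base shrinks, the subsidy per remaining customer balloons. A back-of-the-envelope sketch using the rounded percentages above (the exact FCC figures will differ):

```python
# If total high-cost subsidies rise ~40% while the number of rural customers
# served falls by ~50%, the subsidy per remaining customer nearly triples.
# Rounded, illustrative figures; the exact FCC data will differ.
subsidy_multiplier = 1.40    # total subsidies: up ~40% since 1998
customer_multiplier = 0.50   # customers served: down ~50% since 1998

per_customer_change = subsidy_multiplier / customer_multiplier
print(f"Per-customer subsidy: {per_customer_change:.1f}x the 1998 level")
# 1.40 / 0.50 = 2.8x
```

In other words, on these rough numbers, each remaining rural customer now costs the program nearly three times what a customer did in 1998.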

  • Wireless broadband is the future, especially in rural areas.

“Fiber everywhere” is not a wise use of taxpayer funds and exurban and rural households are increasingly relying on wireless–from satellite, WISPs, and mobile. In 2016, the CDC reported that more households had wireless phone than landline phone service. You’re starting to see “cord cutting” pick up for broadband as well. Census surveys indicate that in 2013, 10% of Internet-using households were mobile Internet only (no landline Internet). By 2015, that percentage had doubled, and about 20% of households were mobile-only. The percentage is likely even higher today now that unlimited data plans are common. Someday soon the FCC will have to conclude that mobile broadband is a substitute for fixed broadband, and subsidy programs should reflect that.

  • Consumer-focused “tech vouchers” would be a huge improvement over current broadband programs.

Current programs subsidize the construction of networks even where there’s no demand. The main reason the vast majority of non-Internet users don’t subscribe to broadband is that they are uninterested in subscribing, according to surveys from the NTIA (55% are uninterested), Pew (70% are uninterested), and FCC and Connected Nation experts (63% are uninterested). With rising costs and diminishing returns to rural fiber construction, the FCC needs to reevaluate USF and make subsidies more consumer-focused. For the past couple of years, the UK has pursued another model for rural broadband: consumer broadband vouchers. Since most people who don’t subscribe to broadband don’t want it, vouchers protect taxpayers from unnecessary expense and from paying for gold-plated services.

For years, economists and the GAO have criticized the structure, complexity, and inefficiency of the USF programs, and particularly the rural program. The FCC is constantly changing the programs because of real and perceived deficiencies, but this has made the USF unwieldy. Montana providers participate in at least seven different rural USF programs alone (that doesn’t include the other USF programs and subprograms or other federal help, like RUS grants).

Unfortunately, most analysis and reporting on US broadband programs can be summed up as “don’t touch the existing programs–just send more money.” (There are some exceptions and scrutiny of the programs, like Tony Romm’s 2015 Politico investigation into the mismanagement of stimulus-funded Ag Department broadband projects.)

“Journalism as advocacy” is unfortunately the norm when it comes to broadband policy. Take, for instance, this article about the digital divide that omits mention of the $100 billion spent in rural areas alone, only to conclude that “small [broadband] companies and cooperatives are going it more or less alone, without much help yet from the federal government.”

(That story and another digital divide story had other problems, namely a reliance on an academic study using faulty data purchased from a partisan campaign firm. FiveThirtyEight deserves credit for acknowledging the data’s flaws, but that should have alerted the editors to the need for still more fact-checking.)

States can’t rewrite federal statutes and regulations, but it’s to the Montana PSC’s great credit that it sensed all is not well. Current trends will only put more stress on the programs. Hopefully other state PUCs will see that the current programs do a disservice to universal service objectives and to consumers.

Last Friday, law enforcement agencies shut down Backpage.com. The website has become infamous for its role in sex trafficking, particularly related to underage victims, and its shutdown is rightly being applauded by many as a significant win for preventing sex trafficking online. This shutdown shows, however, that prosecutors had the tools necessary to go after bad actors prior to the passage of the Stop Enabling Sex Traffickers Act (SESTA) last month. Unfortunately, this is not the first time the government has pushed for regulation of technology knowing it already had the tools and information needed to build a case against bad actors.

The version of SESTA passed by Congress last month included a number of poorly thought-through components, including an ex post facto application and poorly articulated definitions, but it passed both houses of Congress with little opposition. In fact, because the law was seen as must-pass and linked to sex trafficking, the Senate even overwhelmingly rejected an amendment to provide additional funding for prosecuting such crimes. Even before being signed into law, SESTA had already resulted in Reddit and Craigslist removing communities from their platforms within days of its passage. What this most recent event shows is that the government already had the tools to go after bad actors like Backpage, but failed to use them as Congress debated and passed a law that chipped away at protections for the rest of the Internet and gave the government even broader powers.

This is not the first time the government has encouraged, through either its action or inaction, damaging regulation of disruptive technology while knowing it had tools at its disposal that could achieve the desired results without an additional regulatory burden. In 2016, following the San Bernardino shootings, the government argued that it needed more access to encrypted devices like the iPhone when Apple refused to comply with a writ compelling it to unlock the shooters’ phones. The Senate responded to the controversy by proposing a bill that would require businesses like Apple to assist authorities in gaining access to encrypted devices. Thankfully, because the FBI was able to obtain the information it needed without Apple, through a third-party vendor, such calls largely diminished and the legislation never went anywhere. Now, a recent Office of the Inspector General report has revealed the FBI “testified inaccurately or made false statements” regarding its ability to gain data from the encrypted iPhone.


With Facebook CEO Mark Zuckerberg in town this week for a political flogging, you might think that this is the darkest hour for the social networking giant. Facebook stands at a regulatory crossroads, to be sure. But allow me to offer a cynical take, and one based on history: Facebook is potentially poised to score its greatest victory ever as it begins the transition to regulated monopoly status, solidifying its market power and limiting threats from new rivals.

By slowly capitulating to critics (both here and abroad) who are thirsty for massive regulation of the data-driven economy, Facebook is setting itself up as a servant of the state. In the name of satisfying some amorphous political “public interest” standard and fulfilling a variety of corporate responsibility objectives, Facebook will gradually allow itself to be converted into a sort of digital public utility or electronic essential facility.

That sounds like trouble for the firm until you realize that Facebook is one of the few companies that could sacrifice a pound of flesh like that and remain alive. As layers of new regulatory obligations are applied, barriers to new innovation will become formidable obstacles to the very competitors the public so desperately needs right now to offer us better alternatives. Gradually, Facebook will recognize this and go along with the regulatory schemes. And eventually it will become the biggest defender of all of it.

Welcome to Facebook’s broadcast industry moment. The firm is essentially in the same position the broadcast sector was about a century ago when it started cozying up to federal lawmakers. Over time, broadcasters would warmly embrace an expansive licensing regime that would allow all parties—regulatory advocates, academics, lawmakers, bureaucrats, and even the broadcasters themselves—to play out the fairy tale that broadcasters would be good “public stewards” of the “public airwaves” to serve the “public interest.”

Alas, the actual listening and viewing public got royally shafted in this deal.

SESTA passed the Senate last week after having previously passed the House. President Trump is expected to sign it into law despite opposition to this version of the bill from the Department of Justice. As I have previously written, there is a great deal of concern about how the bill may actually make it harder to address online sex trafficking and may more generally impact innovation on the Internet.

The reality is that we are now looking at a post-SESTA world without the full protection of Section 230, and that reality will likely end up far from the best-case scenario, though hopefully not at the worst. Intermediaries, however, do not have the luxury of waiting around to see how the law actually plays out, especially given its retroactive provision. As a result, Reddit has already deleted a variety of subreddits and Craigslist has closed its entire personals section. One can only imagine the difficult decisions facing the creators of dating apps or messaging services.

So what can we expect to happen now…
