Articles by Brent Skorup

Brent Skorup is a senior research fellow with the Technology Policy Program at the Mercatus Center at GMU. He has an economics degree from Wheaton College and a law degree from George Mason University. Opinions are his own.


I saw a Bloomberg News report that officials in Austria and Italy are seeking (aggregated, anonymized) users’ location data from cellphone companies to see if local and national lockdowns are effective.

It’s an interesting idea that raises some possibilities for US officials and tech companies to consider to combat the crisis in the US. Caveat: these are very preliminary thoughts.

Cellphone location data from a phone company is useful but imprecise about your movements: it can typically place you only within a half-mile to a mile.

But smartphone app location data is much more precise since it uses GPS, not cell towers, to track movements. Apps with location services enabled can show people’s movements within meters, not within a half-mile as with cell towers. I suspect 90%+ of smartphone users have GPS location services on (Google Maps, Facebook, Yelp, etc.). App companies have rich datasets of people’s daily movements.

Step 1 – App companies isolate and share location trends with health officials

This would need to be aggregated and anonymized, of course. Tech companies, working with health officials, should identify red and green zones, as Balaji Srinivasan suggests. The point is not to identify individuals but to make generalizations about whether a neighborhood or town is practicing good distancing.

Step 2 – In green zones, where infection/hospitalization rates are low and app data says people are strictly distancing, COVID-19 tests.

If people are spending 22 hours a day not moving, except for brief visits to the grocery store and parks, that’s a good neighborhood. We need tests distributed daily in non-infected areas, perhaps at grocery stores and via USPS and Amazon deliveries. As soon as test production ramps up, tests need to flood into the areas that are healthy. This achieves two things:

  • Asymptomatic people who might spread can stay home.
  • Non-infected people can start returning to work and a life of semi-normalcy of movement with confidence that others who are out are non-contagious.

Step 3 – In red zones, where infection/hospitalization rates are high and people aren’t strictly distancing, public education and restrictions.

At least in Virginia, there is county-level data about where the hotspots are. I expect other states know the counties and neighborhoods that are hit hard. Where there’s overlap of these areas not distancing, step up distancing and restrictions.
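To make the three steps concrete, here is a minimal sketch of how aggregated, anonymized metrics might be mapped to zones. The thresholds, field names, and data below are hypothetical illustrations, not anything health officials have specified:

```python
# A toy sketch of the red/green zone idea from Steps 1-3. All thresholds
# and inputs are hypothetical, not an actual health-agency specification.

def classify_zone(avg_hours_at_home: float, cases_per_100k: float) -> str:
    """Classify a neighborhood from aggregated, anonymized metrics.

    avg_hours_at_home: average daily hours devices stay near their usual
        overnight location (an aggregate distancing proxy, not individual data).
    cases_per_100k: local infection/hospitalization rate.
    """
    distancing = avg_hours_at_home >= 22   # the "22 hours" heuristic above
    low_spread = cases_per_100k < 10       # hypothetical threshold
    if distancing and low_spread:
        return "green"   # flood with tests, ease restrictions
    if not distancing and not low_spread:
        return "red"     # step up education and restrictions
    return "yellow"      # ambiguous zones need case-by-case judgment

print(classify_zone(22.5, 4))    # green
print(classify_zone(14.0, 80))   # red
print(classify_zone(22.5, 80))   # yellow
```

The point of the sketch is that zone classification needs only two coarse, aggregate inputs per neighborhood, never individual movement traces.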

That still leaves open what to do about yellow zones adjacent to red zones, but the main priority should be to identify the green and the red. The longer health officials and the public fly blind with no end in sight, the more people get frustrated, lose jobs, shutter businesses, and violate distancing rules.

To help slow the spread of the coronavirus, the GMU campus is moving to remote instruction and Mercatus is moving to remote work for employees until the risk subsides. GMU and Mercatus join thousands of other universities and businesses making similar moves this week. Millions of people will be working from home, and it will be a major test of American broadband and cellular networks.

There will likely be a loss of productivity nationwide–some things just can’t be done well remotely. But hopefully broadband access is not a major issue. What is the state of US networks? How many people lack the ability to do remote work and remote homework?

The FCC and Pew Research keep pretty good track of broadband buildout and adoption. There are many bright spots but some areas of concern as well.

Who lacks service?

The top question: How many people want broadband but lack adequate service or have no service?

The good news is that around 94% of Americans have access to 25 Mbps landline broadband. (Millions more have access if you include broadband from cellular and WISP providers.) It’s not much consolation to rural customers and remote workers who have limited or no options, but these are good numbers.

According to Pew’s 2019 report, about 2% of Americans cite inadequate or no options as the main reason they don’t have broadband. What is concerning is that this 2% number hasn’t budged in years. In 2015, about the same number of Americans cited inadequate or no options as the main reason they didn’t have home broadband. This resembles what I’ve called “the 2% problem”–about 2% of the most rural American households are extremely costly to serve with landline broadband. Satellite, cellular, or WISP service will likely be the best option.

Mobile broadband trends

Mobile broadband is increasingly an option for home broadband. About 24% of Americans with home Internet are mobile only, according to Pew, up from ~16% in 2015.

The ubiquity of high-speed mobile broadband has been the big story in recent years. Per FCC data, mobile connections increased by about 30 million annually from 2009 to 2017. As of December 2017 (the most recent data), there were about 313 million mobile subscriptions.

Coverage is very good in the US. OpenSignal uses crowdsourced data and software to determine how frequently users’ phones have a 4G LTE network available (a proxy for coverage and network quality) around the world. The US ranked fourth in the world (86%) in 2017, beating out every European country save Norway.

There was also a big improvement in mobile speeds. In 2009, a 3G world, almost all connections were below 3 Mbps. In 2017, a world of 4G LTE, almost all connections were above 3 Mbps.

Landline broadband trends

Landline broadband also increased significantly. From 2009 to 2017, there were about 3.5 million new connections per year, reaching about 108 million connections in 2017. In December 2009, about half of landline connections were below 3 Mbps.

There were some notable jumps in high-speed and rural broadband deployment. There was a big jump in fiber-to-the-premises (FTTP) connections, like FiOS and Google Fiber. From 2012 to 2017, the number of FTTP connections more than doubled, to 12.6 million. Relatedly, sub-25 Mbps connections have been falling rapidly while 100 Mbps+ connections have been shooting up. In 2017, there were more connections with 100 Mbps+ (39 million) than there were connections below 25 Mbps (29 million).

In the most recent five years for which we have data, the number of rural subscribers (not households) with 25 Mbps service increased by 18 million (from 29 million to 47 million).

More Work

We only have good data for the first year of the Trump FCC, so it’s hard to evaluate, but the signs are promising. One of Chairman Pai’s first actions was creating a committee to advise the FCC on broadband deployment (I’m a member). Anecdotally, it’s been fruitful to regularly have industry, academics, advocates, and local officials in the same room to discuss consensus policies. The FCC has acted on many of those policies.

The rollback of common carrier regulations for the Internet, the pro-5G deployment initiatives, and limiting unreasonable local fees for cellular equipment have all helped increase deployment and service quality.

An effective communications regulator largely stays out of the way and removes hindrances to private-sector investment. But the FCC does manage some broadband subsidy programs. The Trump FCC has made some improvements to the $4.5 billion annual rural broadband programs. The 17 or so rural broadband subprograms have metastasized over the years, making for a kludgey and expensive subsidy system.

The recent RDOF reforms are a big improvement since they fund a reverse auction program to shift money away from the wasteful legacy subsidy programs. Increasingly, rural households get broadband from WISP, satellite, and rural cable companies–the RDOF reforms recognize that reality.

Hopefully one day reforms will go even further and fund broadband vouchers. It’s been longstanding FCC policy to fund rural broadband providers (typically phone companies serving rural areas) rather than subsidizing rural households. The FCC should consider a voucher model for rural broadband, $5 or $10 or $40 per household per month, depending on the geography. Essentially the FCC should do for rural households what the FCC does for low-income households–provide a monthly subsidy to make broadband costs more affordable.
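The voucher idea above can be sketched in a few lines. The cost tiers and dollar amounts are hypothetical illustrations of the “$5 or $10 or $40 depending on geography” idea, not an actual FCC proposal:

```python
# A toy sketch of a geography-based broadband voucher: the monthly
# subsidy scales with how costly a household's location is to serve.
# Tier boundaries and amounts are hypothetical.

def monthly_voucher(monthly_cost_premium: float) -> int:
    """Return a monthly voucher ($) for a household, given the estimated
    extra monthly cost ($) of serving its location versus an urban one."""
    if monthly_cost_premium < 20:
        return 5     # modestly costly exurban areas
    if monthly_cost_premium < 60:
        return 10    # typical rural areas
    return 40        # the hardest-to-serve "2% problem" households

print(monthly_voucher(10))   # 5
print(monthly_voucher(35))   # 10
print(monthly_voucher(120))  # 40
```

The design choice worth noting: the subsidy follows the household, not the provider, so satellite, WISP, cable, and phone companies all compete for the same voucher dollars.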

Many of these good deployment trends began in the Obama years, but the Trump FCC has made it a national priority to improve broadband deployment and services. It appears to be working. With the coronavirus and a huge increase in remote work, US networks will be put to a unique test.

Michael Kotrous and I submitted a comment to the FAA about its Remote ID proposals. While we agree with the need for a “digital license plate” for drones, we’re skeptical that requiring an Internet connection is necessary and that an interoperable, national drone traffic management system will work well.

The FAA deserves credit for rigorously estimating the costs of its requirements, which it puts at around $450 million to $600 million over 10 years. These costs fall largely on drone operators and manufacturers, for network (say, LTE) subscriptions and equipment.

The FAA’s proposed requirements aren’t completely hashed out, but we raised two points of caution.

One, many drone flights won’t stray from a pre-programmed route or leave private property. For instance, roof inspections, medical supply deliveries across a hospital campus, train track inspections, and crop spraying via drone can all remain on private property. Such flights pose a de minimis safety concern to manned aircraft, and requiring networking equipment and subscriptions for them seems excessive.

Two, we’re not keen on the FAA and NASA plans for an interoperable, national drone traffic management system. A simple wireless broadcast from a drone should be enough in most circumstances. The FAA proposal would require drone operators to contract with UAS Service Suppliers (USSs), who would be contractors of the FAA; technical standards would come later. This convoluted system of making virtually all drone operations known to the FAA is likely to run aground on technical complexity, technical stagnation, an FAA-blessed USS oligopoly, or all of the above.
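To illustrate why broadcast-only Remote ID needs no network subscription, here is a minimal sketch of a one-way “digital license plate” frame a drone could beacon locally. The field layout is my own illustration, not the FAA’s or any standards body’s actual message format:

```python
# A hypothetical broadcast-only Remote ID frame: the drone packs its ID
# and position into a small fixed-size message and beacons it locally
# (e.g., over Bluetooth/Wi-Fi), with no Internet link or USS involved.
import struct
import time

FRAME_FMT = "!20sddfI"  # serial, lat, lon, altitude (m), Unix timestamp

def pack_remote_id(serial: str, lat: float, lon: float, alt_m: float) -> bytes:
    """Pack a drone's ID and position into a 44-byte broadcast frame."""
    return struct.pack(FRAME_FMT, serial.encode("ascii")[:20],
                       lat, lon, alt_m, int(time.time()))

def unpack_remote_id(frame: bytes):
    """Decode a frame as a nearby observer (or police receiver) would."""
    serial, lat, lon, alt_m, ts = struct.unpack(FRAME_FMT, frame)
    return serial.rstrip(b"\x00").decode("ascii"), lat, lon, alt_m, ts

frame = pack_remote_id("N123UAS", 38.83, -77.30, 120.0)
print(len(frame))                  # 44: small enough for a beacon payload
print(unpack_remote_id(frame)[0])  # N123UAS
```

Anyone in radio range can read the frame; nothing in it requires the operator to hold an LTE subscription or report the flight to a national system.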

The FAA instead should consider allowing states, cities, and landowners to make rules for drone operations when operations are solely on their property. States are ready to step in. The North Dakota legislature, for instance, authorized $28 million a few months ago for a statewide drone management system. Other states will follow suit, and a federated, geographically separated drone management system could develop, if the FAA allows it. That would reduce the need for complex, interoperable USS and national drone traffic management systems.

Further reading:

Refine the FAA’s Remote ID Rules to Ensure Aviation Safety and Public Confidence, comment to the FAA (March 2020), https://www.mercatus.org/publications/technology-and-innovation/refine-faa%E2%80%99s-remote-id-rules-ensure-aviation-safety-and

Auctioning Airspace, North Carolina Journal of Law & Technology (October 2019), https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3284704

Last week I attended the Section 230 cage match workshop at the DOJ. It was a packed house, likely because AG Bill Barr gave opening remarks. It was fortuitous timing for me: my article with Jennifer Huddleston, The Erosion of Publisher Liability in American Law, Section 230, and the Future of Online Curation, was published 24 hours before the workshop by the Oklahoma Law Review.

These were my impressions of the event:

I thought it was a pretty well-balanced event and surprisingly civil for such a contentious topic. There were strong Section 230 defenders and strong Section 230 critics, and several who fell in between. There were a couple of cheers after a few pointed statements from panelists, but the audience didn’t seem to fall on one side or the other. I’ll add that my friend and co-blogger Neil Chilson gave an impressive presentation about how Section 230 helped make the “long tail” of beneficial Internet-based communities possible.

AG Bill Barr gave the opening remarks, which are available online. A few things jumped out. He suggested that Section 230 had its place but that Internet companies are no longer an infant industry. In his view, the courts have expanded Section 230 beyond the drafters’ intent, and the Reno decision “unbalanced” protections that were intended to protect minors. The gist of his statement was that the law needs to be “recalibrated.”

Each of these points was disputed by one or more panelists, but the message to the Internet industry was clear: the USDOJ is scrutinizing industry concentration and its relationship to illegal and antisocial online content.

The workshop signals that there is now a large, bipartisan coalition that would like to see Section 230 “recalibrated.” The problem for this coalition is that its members don’t agree on which types of content providers should be liable for, and they are often at cross-purposes. The problematic content ranges from sex trafficking, to stalkers, to opiate trafficking, to revenge porn, to unfair political ads. For conservatives, social media companies take down too much content, intentionally helping progressives. For progressives, social media companies leave up too much content, unwittingly helping conservatives.

I’ve yet to hear a convincing way to modify Section 230 that (a) satisfies this shaky coalition, (b) would be practical to comply with, and (c) would be constitutional.

Now, Section 230 critics are right: the law blurs the line between publisher and conduit. But this is not unique to Internet companies. The fact is, courts (and federal agencies) blurred the publisher-conduit dichotomy for fifty years for mass media distributors and common carriers as technology and social norms changed. Some cases illustrate the phenomenon:

In Auvil v. CBS 60 Minutes, a 1991 federal district court decision, Washington apple growers sued local CBS affiliates for airing allegedly defamatory programming. The federal district court dismissed the case on the grounds that the affiliates were conduits of CBS programming. Critically, the court recognized that the CBS affiliates “had the power to” exercise editorial control over the broadcast and “in fact occasionally [did] censor programming . . . for one reason or another.” Still, the case was dismissed. The principle has been cited by other courts. Publishers can be conduits.

Conduits can also be publishers. In 1989, Congress passed a law requiring phone providers to restrict minors’ access to “dial-a-porn” services. Dial-a-porn companies sued. In Information Providers Coalition v. FCC, the 9th Circuit Court of Appeals held that regulated common carriers are “free under the Constitution to terminate service” to providers of indecent content. The court relied on its decision a few years earlier in Carlin Communications, noting that when a common carrier phone company connects thousands of subscribers simultaneously to the same content, the “phone company resembles less a common carrier than it does a small radio station.”

Many Section 230 reformers believe Section 230 mangled the common law and would like to see the restoration of the publisher-conduit dichotomy. As our research shows, that dichotomy had already been blurred for decades. Until advocates and lawmakers acknowledge these legal trends and plan accordingly, reformers risk throwing out the baby with the bathwater.

Relevant research:
Brent Skorup & Jennifer Huddleston, The Erosion of Publisher Liability in American Law, Section 230, and the Future of Online Curation (Oklahoma Law Review).

Brent Skorup & Joe Kane, The FCC and Quasi–Common Carriage: A Case Study of Agency Survival (Minnesota Journal of Law, Science & Technology).

Technopanics, Progress Studies, AI, spectrum, and privacy were hot topics at the Technology Liberation Front in the past year. Below are the most popular posts from 2019.

Glancing at our site metrics over the past 10 years, the biggest topics in the 2010s were technopanics, Bitcoin, net neutrality, the sharing economy, and broadband policy. Looking forward at the 2020s, I’ll hazard some predictions about what will be significant debates at the TLF: technopanics and antitrust, AVs, drones, and the future of work. I suspect that technology and federalism will be long-running issues in the next decade, particularly for drones, privacy, AVs, antitrust, and healthcare tech.

Enjoy 2019’s top 10, and Happy New Year.

10. 50 Years of Video Games & Moral Panics by Adam Thierer

I have a confession: I’m 50 years old and still completely in love with video games.

As a child of the 1970s, I straddled the divide between the old and new worlds of gaming. I was (and remain) obsessed with board and card games, which my family played avidly. But then Atari’s home version of “Pong” landed in 1976. The console had rudimentary graphics and controls, and just one game to play, but it was a revelation. After my uncle bought Pong for my cousins, our families and neighbors would gather round his tiny 20-inch television to watch two electronic paddles and a little dot move around the screen.

9. The Limits of AI in Predicting Human Action by Anne Hobson and Walter Stover

Let’s assume for a second that AIs could possess not only all relevant information about an individual, but also that individual’s knowledge. Even if companies somehow could gather this knowledge, it would only be a snapshot at a moment in time. Infinite converging factors can affect one’s next decision to not purchase a soda, even if your past purchase history suggests you will. Maybe you went to the store that day with a stomach ache. Maybe your doctor just warned you about the perils of high fructose corn syrup so you forgo your purchase. Maybe an AI-driven price raise causes you to react by finding an alternative seller.

In other words, when you interact with the market—for instance, going to the store to buy groceries—you are participating in a discovery process about your own preferences or willingness to pay.

8. Free-market spectrum policy and the C Band by Brent Skorup

A few years ago I would have definitely favored speed and the secondary market plan. I still lean towards that approach, but I’m a little more on the fence after reading work by Richard Epstein and others about the “public trust doctrine.” This is a traditional governance principle that requires public actors to receive fair value when disposing of public property. It prevents public institutions from giving discounted public property to friends and cronies. Clearly, cronyism isn’t the case here, and the FCC can’t undo what past FCCs did generations ago in giving away spectrum. I think the need for speedy deployment trumps the windfall issue here, but it’s a closer call for me than in the past.

One proposal that hasn’t been contemplated with the C Band but might have merit is an overlay auction with a deadline. With such an auction, the FCC gives incumbent users a deadline to vacate a band (say, 5 years). The FCC then auctions flexible-use licenses in the band. The FCC receives the auction revenues and the winning bidders are allowed to deploy services immediately in the “white spaces” unoccupied by the incumbents. The winning bidders are allowed to pay the incumbents to move out before the deadline.

7. STELAR Expiration Warranted by Hance Haney

The retransmission fees were purposely set low to help the emerging satellite carriers get established in the marketplace when innovation in satellite technology still had a long way to go. Today the carriers are thriving business enterprises, and there is no need for them to continue receiving subsidies. Broadcasters, on the other hand, face unprecedented competition for advertising revenue that historically covered the entire cost of content production.

Today a broadcaster receives 28 cents per subscriber per month when a satellite carrier retransmits their local television signal. But the fair market value of that signal is actually $2.50, according to one estimate.

6. What is Progress Studies? by Adam Thierer

How do we shift cultural and political attitudes about innovation and progress in a more positive direction? Collison and Cowen explicitly state that the goal of Progress Studies transcends “mere comprehension” in that it should also look to “identify effective progress-increasing interventions and the extent to which they are adopted by universities, funding agencies, philanthropists, entrepreneurs, policy makers, and other institutions.”

But fostering social and political attitudes conducive to innovation is really more art than science. Specifically, it is the art of persuasion. Science can help us amass the facts proving the importance of innovation and progress to human improvement. Communicating those facts and ensuring that they infuse culture, institutions, and public policy is more challenging.

5. How Do You Value Data? A Reply To Jaron Lanier’s Op-Ed In The NYT by Will Rinehart

All of this is to say that there is no one single way to estimate the value of data.

As for the Lanier piece, here are some other things to consider:

A market for data already exists. It just doesn’t include a set of participants that Jaron wants to include, which are platform users.    

Will users want to be data entrepreneurs, looking for the best value for their data? Probably not. At best, they will hire an intermediary to do this, which is basically the job of the platforms already.

An underlying assumption is that the value of data is greater than the value advertisers are willing to pay for a slice of your attention. I’m not sure I agree with that.

Finally, how exactly do you write these kinds of laws?

4. Explaining the California Privacy Rights and Enforcement Act of 2020 by Ian Adams

As released, the initiative is equal parts privacy extremism and cynical politics. Substantively, some will find elements to applaud in the CPREA, from prohibitions on the use of behavioral advertising to reputational risk assessments (all of which are deserving of their own critiques), but the operational structure of the CPREA is nothing short of disastrous. Here are some of the worst bits:

3. Best Practices for Public Policy Analysts by Adam Thierer

So, for whatever it’s worth, here are a few ideas about how to improve your content and your own brand as a public policy analyst. The first list is just some general tips I’ve learned from others after 25 years in the world of public policy. Following that, I have also included a separate set of notes I use for presentations focused specifically on how to prepare effective editorials and legislative testimony. There are many common recommendations on both lists, but I thought I would just post them both here together.

2. An Epic Moral Panic Over Social Media by Adam Thierer

Strangely, many elites, politicians, and parents forget that they, too, were once kids and that their generation was probably also considered hopelessly lost in the “vast wasteland” of whatever the popular technology or content of the day was. The Pessimists Archive podcast has documented dozens of examples of this recurring phenomenon. Each generation makes it through the panic du jour, only to turn around and start lambasting newer media or technologies that they worry might be rotting their kids to the core. While these panics come and go, the real danger is that they sometimes result in concrete policy actions that censor content or eliminate choices that the public enjoys. Such regulatory actions can also discourage the emergence of new choices.

1. How Conservatives Came to Favor the Fairness Doctrine & Net Neutrality by Adam Thierer

If I divided my time in Tech Policy Land into two big chunks of time, I’d say the biggest tech-related policy issue for conservatives during the first 15 years I was in the business (roughly 1990 – 2005) was preventing the resurrection of the so-called Fairness Doctrine. And the biggest issue during the second 15-year period (roughly 2005 – present) was stopping the imposition of “Net neutrality” mandates on the Internet. In both cases, conservatives vociferously blasted the notion that unelected government bureaucrats should sit in judgment of what constituted “fairness” in media or “neutrality” online.

Many conservatives are suddenly changing their tune, however.

After coming across some reviews of Thomas Philippon’s book, The Great Reversal: How America Gave Up on Free Markets, I decided to get my hands on a copy. Most of the reviews and coverage mention the increasing monopoly power of US telecom companies and rising prices relative to European companies. In fact, Philippon tells readers in the intro of the book that the question that spurred him to write Great Reversal is “Why on earth are US cell phone plans so expensive?”

As someone who follows the US mobile market closely, I was a little disappointed that the analysis of the telecom sector is rather slim. There are only a handful of pages (out of 340) of Europe-US telecom comparison, featuring one story about French intervention and one chart. This isn’t a criticism of the book–Philippon doesn’t pitch it as a telecom policy book. However, European telecom policy isn’t the clear success story the book makes it out to be.

The general narrative in the book is that US lawmakers are entranced by the laissez-faire Chicago school of antitrust and placated by dark money campaigns. The result, as Philippon puts it, is that “Creeping monopoly power has slowly but surely suffocated the [US] middle class” and today Europe has freer markets than the US. That may be, but the telecom sectors don’t provide much support for that idea.

Low Prices in European Telecom . . .

Philippon says that “The telecommunications industry provides another example of successful competition policy in Europe.”

He continues:

The case of France provides a striking example of competition. Free Mobile . . . obtained its 4G license [with regulator assistance] in 2011 and became a significant competitor for the three large incumbents. The impact was immediate. . . . In about six months after the entry of Free Mobile, the price paid by French consumers had dropped by about 40 percent. Wireless services in France had been more expensive than in the US, but now they are much cheaper.

It’s true, mobile prices are generally lower in Europe. Monthly average revenue per user (ARPU) in the US, for instance, is about double the ARPU in the UK (~$42 v. ~$20 in 2016). And, as Philippon points out, cellular prices are lower in France as well.

One issue with this competition “success story”: the US also has four mobile carriers, and had four even prior to 2011. Since the number of competitors is the same in France and the US, the number of competitors doesn’t really explain the price difference between the two countries. (India, for instance, has fewer providers than the US and France–and much lower cellular prices, so the number of competitors isn’t a great predictor of pricing.)

. . . and Low Investment

If “lower telecom prices than the US” is the standard, then yes, European competition policy has succeeded. But if consumers and regulators prioritize other things, like industry investment, network quality (fast speeds), and rural coverage, the story is much more mixed. (Bret Swanson at AEI points to other issues with Philippon’s analysis.) Philippon’s singular focus on telecom prices and number of competitors distracts from these other important competition and policy dimensions.

According to OECD data, for instance, in 2015 the US exceeded the OECD average for spending on IT and communications equipment as a percent of GDP. France might have lower cell phone bills, but US telecom companies spend nearly three times what French telecom companies spend on this measure (1.1% of GDP v. 0.4% of GDP).

Further, telecom investment per capita in the US was much higher than in its European counterparts. US telecom companies spent about 55 percent more per capita than French telecoms ($272 v. $175), according to the same OECD reports. And France is one of the better European performers: many European carriers spend, on a per capita basis, less than half what US carriers spend. US carriers spend 130% more than UK telecoms and 145% more than German telecoms.
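The France comparisons follow directly from the OECD figures quoted above; a quick check of the arithmetic:

```python
# Checking the US-France investment comparisons from the cited OECD figures.
us_ict_share, fr_ict_share = 1.1, 0.4    # IT/communications spending, % of GDP (2015)
us_per_capita, fr_per_capita = 272, 175  # telecom investment, $ per person

print(round(us_ict_share / fr_ict_share, 2))        # 2.75: ~2.75x France's GDP share
print(round(us_per_capita / fr_per_capita - 1, 2))  # 0.55: ~55% more per capita
```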

This investment deficit in Europe has real-world effects on consumers. OpenSignal uses crowdsourced data and software to determine how frequently users’ phones have a 4G LTE network available (a proxy for coverage and network quality) around the world. The US ranked fourth in the world (86%) in 2017, beating out every European country save Norway. In contrast, France and Germany ranked 60th and 61st, respectively, on this network quality measure, beaten out by less wealthy nations like Kazakhstan, Cambodia, and Romania.

The European telecom regulations and anti-merger policies created a fragmented market and financially strapped companies. As a result, investors are fleeing European telecom firms. According to the Financial Times and Bloomberg data, between 2012 and 2018 the value of Europe’s telecom companies fell almost 50%. In the same period, the value of the US sector rose 70% and the Asian sector rose 13%.

Price Wars or 5G Investment?

Philippon is right that Europe has chosen a different path than the US when it comes to telecom services. Whether they’ve chosen a pro-consumer path depends on where you sit (and live). Understandably, academics and advocates living in places like Boston, New York and DC look fondly at Berlin and Paris broadband prices. Network quality outside of the cities and suburbs rarely enters the picture in these policy discussions, and Philippon’s book is no exception. US lawmakers and telecom companies have prioritized non-price dimensions: network quality, investment in 5G, and rural coverage.

If anything, European regulators seem to be retreating somewhat from the current path of creating competitors and regulating prices. As the Financial Times wrote last year, the trend in European telecom is consolidation. The French regulator ARCEP reversed course last year and signaled a new openness to telecom consolidation.

Still, there are significant obstacles to consolidation in European markets, and it seems likely Europe will fall further behind the US and China in rural network coverage and 5G investment. European telecom companies are in a bit of a panic about this, which they expressed in a letter to the European Commission this month urging reform.

In short, European telecom competition policy is not the unqualified success depicted in Great Reversal. To his credit, Philippon in the book intro emphasizes humility about prognostications and the limits of experts’ knowledge:

I readily admit I don’t have all the answers. …I would suggest . . . that [economists’] prescriptions be taken with a (large) grain of salt. When you read an author or commentator who tells you something obvious, take your time and do the math. Almost every time, you’ll discover that it wasn’t really obvious at all. I have found that people who tell you that the answers to the big questions in economics are obvious are telling you only half of the story.

Couldn’t have put it better myself.

Credit to Connor Haaland for research assistance.

A few weeks ago I was invited to provide testimony about rural broadband policy to the Communications and Technology Committee in the Pennsylvania Senate (video recording of the hearing). My co-panelists were Kathryn de Wit from Pew and Prof. Sasha Meinrath from Penn State University.

In preparing for the testimony I was surprised to learn how much money leaves Pennsylvania annually to fund the federal Universal Service Fund programs. In recent years, a net $200 million has left the state annually, disbursed by USAC and in other states. That’s a lot of money considering that Pennsylvania, like many geographically large states, has its own broadband deployment problems.

From the Intro:

The federal government has spent more than $100 billion on rural telecommunications in the past 20 years. Most of that total comes from the federal Universal Service Fund (USF), which disburses about $4.5 billion annually to rural providers across the country. In addition, the Pennsylvania Universal Service Fund redistributes about $32 million annually from Pennsylvania phone customers to Pennsylvania phone companies serving rural areas.

Are rural residents seeing commensurate benefits trickle down to them? That seems doubtful. These programs are complex and disburse subsidies in puzzling and uneven ways. Reform of rural telecommunications programs is urgently needed. FCC data suggest that the current USF structure disproportionately penalizes Pennsylvanians—a net $800 million left the state from 2013 to 2017.

I made a few recommendations, which mostly apply to state legislators in other states looking at rural broadband issues.

I also came across an interesting program in Pennsylvania spearheaded in 2018 by Gov. Wolf. It’s a $35 million grant program to rural providers. From the Governor’s website:

The program was a partnership between the Office of Broadband Initiatives and PennDOT. The $35 million of incentive funding was provided through PennDOT to fulfill its strategic goal of supporting intelligent transportation systems, connected vehicle infrastructure, and improving access to PennDOT’s facilities. In exchange for incentive funding, program participants were required to supply PennDOT with the use of current and future network facilities or services.

It’s too early to judge the results of that program but I’ve long thought state DOTs should collaborate more with state telecom officials. There’s a lot of federal and state transportation money that can do double duty in supporting broadband deployment efforts, a subject Prof. Korok Ray and I take up in our recently-released Mercatus Paper, “Smart Cities, Dumb Infrastructure.”

For more, you can find my full testimony at the Mercatus website.

The Ray-Skorup paper, “Smart Cities, Dumb Infrastructure,” about transportation funds and their use in telecom networks is on SSRN.

Last month I spoke at the Innovation Summit in Orlando, hosted by the James Madison Institute. My co-panelists on the transportation panel were Jamal Sowell, President and CEO of Enterprise Florida, state senator Jeff Brandes, who cosponsored Florida’s autonomous vehicle legislation this year, and Stephanie Smith from Uber. Romina Boccia from the Heritage Foundation was our moderator.

Flyer for September 2019 JMI event.

It was a great event and the panel discussion made clear that Florida is at the forefront of autonomous vehicle policy. The panel got me thinking about some nationwide trends that are pushing people towards ride-sharing and, eventually, mobility as a service and autonomous vehicles. Florida seems well positioned but many of these trends will affect the ridesharing and autonomous vehicle market in the next decade.

Rising Cost of Car Ownership

Cars are expensive to own and maintain. Using AAA estimates, the annual cost of a new car in 2019 is $9,300 (nearly $800 per month). These costs are mostly depreciation and insurance, but also include gas, registration, and maintenance.

Used cars are significantly cheaper to own since depreciation is steepest early in a car’s life. I haven’t seen much research on used car costs but out of curiosity I estimated the cost of ownership of our used car. We recently sold my wife’s 2010 Corolla, which she’d bought in 2012. The annual cost of ownership of the Corolla (insurance, maintenance, gas, depreciation) came to about $4,200 ($350 per month).

But costs are much higher for families. Parents adding a teenage boy to their car insurance policy, for instance, can expect their annual insurance costs to jump by over $6,000.

Using the AAA numbers and these insurance numbers, we can estimate the costs for adding a new vehicle and a teenage driver for a family budget: from about $15,000 annually (getting a teen driver a new sedan) to about $10,000 annually (getting a teen driver a used compact).
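The back-of-envelope math above can be sketched in a few lines. The only inputs are the figures already cited in this post (the AAA new-car estimate, my used-Corolla estimate, and the teen-driver insurance jump); the function name is just for illustration:

```python
# Back-of-envelope cost of adding a teen driver to a family's fleet.
# Figures come from the post: AAA's ~$9,300/year for a new car, my
# ~$4,200/year used-Corolla estimate, and a ~$6,000/year insurance jump.
NEW_CAR_ANNUAL = 9_300        # AAA estimate, new sedan
USED_CAR_ANNUAL = 4_200       # author's used-compact estimate
TEEN_INSURANCE_JUMP = 6_000   # added insurance for a teenage driver

def teen_driver_cost(car_annual: int, insurance_jump: int) -> int:
    """Annual budget hit: car ownership cost plus the insurance jump."""
    return car_annual + insurance_jump

# New sedan + teen driver: 9,300 + 6,000 = 15,300 ("about $15,000")
print(teen_driver_cost(NEW_CAR_ANNUAL, TEEN_INSURANCE_JUMP))
# Used compact + teen driver: 4,200 + 6,000 = 10,200 ("about $10,000")
print(teen_driver_cost(USED_CAR_ANNUAL, TEEN_INSURANCE_JUMP))
```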

Further, car repair costs are only going to increase with time. The introduction of sensors and other technology into new cars has caused a spike in repair and insurance costs. Automakers are also adding expensive-to-fix components to engines, like turbochargers and CVTs, in an attempt to comply with federal CAFE standards.

One signal of the increasing costs of repair is rising insurance rates. Over the last four years, the consumer price index for auto insurance increased about 27%. During the same period, the CPI for all goods increased about 6%. That increase even exceeds the CPI for hospital services (18%).

This is likely one reason car leasing is becoming more popular, even with good-credit drivers–leasing allows you to shift the (increasing) costs of car depreciation and maintenance to leasing companies.

Mobility as a Service and AVs in Florida

Florida seems to have the perfect recipe for AV and mobility as a service success. First and foremost, the state has a governor and a legislature that welcome AV companies.

The state also has:

  • many students, retirees, tourists, and uninsured drivers who need rides but don’t use a car regularly
  • very high insurance premiums
  • no-fault auto insurance, which simplifies the claims process in personal injury cases
  • flat terrain and no snow

Suppose a couple in Florida is considering getting a third car, a new car for their teenage son. If their son isn’t interested in getting a driver’s license (which is increasingly common) and they live in an area with high penetration of ridesharing services, they might be willing to purchase an annual subscription to mobility as a service. For many families on the fence about getting a second or third car, even a $10,000 annual subscription might make financial sense.

AV tech is slowly but surely approaching mass-market deployment. This month, Waymo announced it was increasing the number of autonomous vehicles on Phoenix-area roads without safety drivers in the front seats. These trends–auto leasing and putting off getting a license–are accelerating in urbanized areas in the South. It’s probably where mobility as a service companies and, eventually, AV companies will find their largest potential market.

In the US there is a tangle of communications laws that were added over decades by Congress as–one-by-one–broadcast, cable, and satellite technologies transformed the TV marketplace. The primary TV laws are from 1976, 1984, and 1992, though Congress creates minor patches when the marketplace changes and commercial negotiations start to unravel.

Congress, to its great credit, largely has left alone Internet-based TV (namely, IPTV and vMVPDs) which has created a novel “problem”–too much TV. Internet-based TV, however, for years has put stress on the kludge-y legacy legal system we have, particularly the impenetrable mix of communications and copyright laws that regulates broadcast TV distribution.

Internet-based TV does two things–it undermines the current system with regulatory arbitrage but also shows how tons of diverse TV programming can be distributed to millions of households without Congress (and the FCC and the Copyright Office) injecting politics into the TV marketplace.

Locast TV is the latest Internet-based TV distributor to threaten to unravel parts of the current system. In July, broadcast programmers sued Locast (and its founder, David Goodfriend), and in September, Locast filed its own suit against the broadcast programmers.

A portion of US TV regulations.

Many readers will remember the 2014 Aereo decision from the Supreme Court. Much like Aereo, Locast TV captures free broadcast TV signals in the markets where it operates and transmits the programming via the Internet to viewers in that market. That said, Locast isn’t Aereo.

Aereo’s position was that it could relay broadcast signals without paying broadcasters because it wasn’t a “cable company” (a critical category in copyright law). The majority of the Supreme Court disagreed; Aereo closed up shop.

Locast has a different position: it says it can relay broadcast signals without paying because it is a nonprofit.

It’s a plausible argument. Federal copyright law has a carveout allowing “nonprofit organizations” to relay broadcast signals without payment so long as the nonprofit operates “without any purpose of direct or indirect commercial advantage.”

The broadcasters are focusing on this latter provision, that any nonprofit taking advantage of the carveout mustn’t have commercial purpose. David Goodfriend, the Locast founder, is a lawyer and professor who, apparently, sought to abide by the law. However, the broadcasters argue, his past employment and commercial ties to pay-TV companies mean that the nonprofit is operating for commercial advantage.

It’s hard to say how a court will rule. Assuming a court takes up the major issues, judges will have to decide what “indirect commercial advantage” means. That’s a fact-intensive inquiry. The broadcasters will likely search for hot docs or other evidence that Locast is not a “real” nonprofit. Whatever the facts are, Locast’s arbitrage of the existing regulations is one that could be replicated.

Nobody likes the existing legacy TV regulation system: broadcasters dislike being subject to compulsory licenses; cable and satellite operators dislike being forced to carry some broadcast TV and to pay for a bizarre “retransmission” right. Copyright holders are largely sidelined in these artificial commercial negotiations. Wholesale reform–so that programming negotiations look more like the free-market world of Netflix and Hulu programming–would mean every party has to give up something they like to improve the overall system.

The Internet’s effect on traditional providers’ market share has been modest to date, but hopefully Congress will anticipate the changing marketplace before regulatory distortions become intolerable.

Additional reading: Adam Thierer & Brent Skorup, Video Marketplace Regulation: A Primer on the History of Television Regulation and Current Legislative Proposals (2014).

Last month, Senator and presidential candidate Elizabeth Warren released a campaign document, Plan for Rural America. The lion’s share of the plan proposed government-funded and -operated health care and broadband. The broadband section of the plan proposes raising $85 billion (from taxes?) to fund rural broadband grants to governments and nonprofits. The Senator then placed a Washington Post op-ed decrying the state of rural telecommunications in America.

While it’s commendable she has a plan, it doesn’t materially improve upon existing, flawed rural telecom subsidy programs, which receive only brief mention. In particular, the Plan places an unwarranted faith in the power of government telecom subsidies, despite red flags about their efficacy. The op-ed misdiagnoses rural broadband problems and somehow lays decades of real and perceived failure of government policy at the feet of the current Trump FCC, and Chairman Pai in particular.

As a result, the proposals–more public money, more government telecom programs–are the wrong treatment. The Senator’s plan to wire every household is undermined by “the 2% problem”–the cost to build infrastructure to the most remote homes is massive. 

Other candidates (and perhaps President Trump) will come out with rural broadband plans, so it’s worth diving into the issue. Doubling down on a 20-year-old government policy–more subsidies to more providers–will mostly just entrench the current costly system.

How dire is the problem?

Somewhere around 6% of Americans (about 20 million people) are unserved by a 25 Mbps landline connection. But that means around 94% of Americans have access to 25 Mbps landline broadband. (Millions more have access if you include broadband from cellular and WISP providers.)

Further, rural buildout has been improving for years, despite the high costs. From 2013 to 2017, under Obama and Trump FCCs, landline broadband providers covered around 3 or 4 million new rural customers annually. This growth in coverage seems to be driven by unsubsidized carriers because, as I found in Montana, FCC-subsidized telecom companies in rural areas are losing subscribers, even as universal service subsidies increased.

This rural buildout is more impressive when you consider that most people who don’t subscribe today simply don’t want Internet access. Somewhere between 55% and 80% of nonadopters don’t want it, according to Department of Commerce and Pew surveys. Millions of rural homes are connected annually even though most nonadopters today don’t want the service.

These are the core problems for rural telecom: (1) poorly-designed, overlapping, and expensive programs and (2) millions of consumers who are uninterested in subscribing to broadband.

Tens of billions for government-operated networks

The proposed new $85 billion rural broadband fund gets most of the headlines. It resembles the current universal service programs–the fund would disburse grants to providers, except the grants would be restricted to nonprofit and government operators of networks. Most significant: Senator Warren promises in her Plan for Rural America that, as President, she will “make sure every home in America has a fiber broadband connection.” 

Every home?

This fiber-to-every-farm idea had advocates 10 years ago. The idea has failed to gain traction because it runs into the punishing economics of building networks.

Costs rise non-linearly for the last few percent of households and $85 billion would bring fiber only to a small sliver of US households. According to estimates from the Obama FCC, it would cost $40 billion to build fiber to the final 2% of households. Further, the network serving those 2% of households would require an annual subsidy of $2 billion simply to maintain those networks since revenues are never expected to cover ongoing costs. 
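A rough sketch of that last-2% arithmetic, using the FCC figures cited above. The US household count is my own round-number assumption (roughly 128 million), not from the FCC estimate:

```python
# Why $85B doesn't go far: per-household cost of fibering the last 2%.
US_HOUSEHOLDS = 128_000_000            # rough US household count (assumption)
LAST_2_PCT = int(US_HOUSEHOLDS * 0.02) # the hardest-to-reach households
FCC_LAST_2_PCT_COST = 40_000_000_000   # Obama FCC build-out estimate cited above
ANNUAL_MAINTENANCE = 2_000_000_000     # ongoing subsidy estimate cited above

build_per_home = FCC_LAST_2_PCT_COST / LAST_2_PCT
maintain_per_home = ANNUAL_MAINTENANCE / LAST_2_PCT

print(f"${build_per_home:,.0f} to build per household")       # ~$15,625 each
print(f"${maintain_per_home:,.0f} per household, every year") # ~$781 each, forever
```

On those assumptions, the final 2% costs north of $15,000 per home up front, before any ongoing subsidy–which is why $85 billion would reach only a small sliver of unserved households.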

Recent history suggests rapidly diminishing returns and that $85 billion of taxpayer money will be misspent. If the economics weren’t difficult enough, real-world politics and government inefficiency also degrade lofty government broadband plans. For example, Australia’s construction of a nationwide publicly-owned fiber network–the nation’s largest-ever infrastructure project–is billions over budget and years behind schedule. The RUS broadband grant debacle in the US only supports the case that $85 billion simply won’t go that far. As Will Rinehart says, profit motive is not the cause of rural broadband problems. Government funding fixes neither the underlying economics nor the problem of government efficacy.

Studies will probably come out saying it can be done more cheaply, but America has been running a similar experiment for 20 years. Since 1998, as economists Scott Wallsten and Lucía Gamboa point out, the US government has spent around $100 billion on rural telecommunications. What does that $100 billion get? Mostly maintenance of existing rural networks and about a 2% increase in phone adoption.

Would the Plan improve or repurpose the current programs and funding? We don’t know. The op-ed from Sen. Warren complains that:

the federal government has shoveled more than a billion in taxpayer dollars per year to private ISPs to expand broadband to remote areas, but these providers have done the bare minimum with these resources.

This understates the problem. The federal government “shovels” not $1 billion, but about $5 billion, annually to providers in rural areas, mostly from the Universal Service Fund Congress established in 1996.

As for the “public option for broadband”–extensive construction of publicly-run broadband networks–I’m skeptical. Broadband is not like a traditional utility. Unlike electricity, water, or sewer, a city or utility network doesn’t have a captive customer base. There are private operators out there.

As a result, public operation of networks is a risky way to spend public funds. Public and public-private operation of networks often leads to financial distress and bankruptcy, as residents in Provo, Lake County, Kentucky, and Australia can attest.

Rural Telecom Reform

I’m glad Sen. Warren raised the issue of rural broadband, but the Plan’s drafters seem uninterested in digging into the extent of the problem and in solutions aside from throwing good money after bad. Lawmakers should focus on fixing the multi-billion dollar programs already in existence at the FCC and Ag Department, which are inexplicably complex, expensive to administer, and inequitable toward their ostensible beneficiaries.

Why, for instance, did rural telecom subsidies break down to about $11 per rural household in Sen. Warren’s Massachusetts in 2016 when it was about $2,000 per rural household in Alaska?

Alabama and Mississippi have similar geographies and rural populations. So why did rural households in Alabama receive only about 20% of what rural Mississippi households receive? 

Why have administrative costs as a percentage of the Universal Service Fund more than doubled since 1998? It costs $200 million annually to administer the USF programs today. (Compare to the FCC’s $333 million total budget request to Congress in FY 2019 for everything else the FCC does.)

I’ve written about reforms under existing law, like OTARD rule reform–letting consumers freely install small, outdoor antennas to bring broadband to rural areas–and transforming the current program funds into rural broadband vouchers. There’s also a role for cities and counties to help buildout by constructing long-lasting infrastructure like poles, towers, and fiber conduit. These assets could be leased out at low cost to providers.

Conclusion

After years of planning, the FCC reformed some of the rural telecom programs in 2017. However, the reforms are partial and it’s too early to evaluate the results. The foundational problem is with the structure of existing programs. Fixing that structure should be a priority for any Senator or President concerned about rural broadband. Broadband vouchers for rural households would fix many of the problems, but lawmakers first need to question the universal service framework established over 20 years ago. There are many signs it’s not fit for purpose.