Miscellaneous

My latest AIER column examines the impact that increased lobbying and regulatory accumulation have on entrepreneurialism and innovation more generally. Unsurprisingly, it's not a healthy relationship: a growing body of economic evidence concludes that increases in the former lead to much less of the latter.

This is a topic that my Mercatus Center colleagues and I have done a lot of work on through the years. But what got me thinking about the topic again was a new NBER working paper by economists Germán Gutiérrez and Thomas Philippon entitled, “The Failure of Free Entry.” Their new study finds that “regulations and lobbying explain rather well the decline in the allocation of entry” that we have seen in recent years.

Many economists have documented how business dynamism (new firm creation, entry, churn, etc.) appears to have slowed in the US. Explanations vary, but Gutiérrez and Philippon show that "regulations have a negative impact on small firms, especially in industries with high lobbying expenditures." Their results also document how regulations "have a first order impact on incumbent profits and suggest that the regulatory capture may have increased in recent years."

In other words, lobbying and cronyism breed a culture of rent-seeking, over-regulation, and rule accumulation that directly limits new startup activity and innovation more generally. This is a recipe for economic stagnation if left unchecked. Continue reading →

The urban air mobility stories keep stacking up in 2019. A few highlights and a few thoughts.

Commercial developments

There have been tons of urban air mobility announcements, partnerships, and demos in 2019. EHang, the Chinese drone maker, seems to be farthest along in eVTOL development, though many companies are working with regulators to bring about eVTOL services in the next five years. 

In April, company representatives said EHang would start selling its two-passenger, autonomous eVTOL to commercial operators next year for about $350,000. EHang's co-founder says the two-passenger autonomous aircraft is already completing routine flights in China, carrying tourists between a hotel and local attractions.

Uber recently announced it will offer shared-ride helicopter service between Manhattan and JFK Airport starting in July. This week, Voom, Airbus's helicopter ridesharing service, announced it will expand to San Francisco; it already operates in São Paulo and Mexico City.

These helicopter rides are targeting popular urban routes (airport-to-airport, CBD-to-airport, etc.) for customers who are willing to pay to shorten a one-hour car ride to a ten-minute helicopter ride. Fees are typically $150 to $250 one-way. Both companies want to get a sense of demand, price, and frequency for eVTOL services.

[BS – July 9 update: Last week Xin Gou, a pilot, reported on Twitter that EHang had sold 18 of its 2-passenger eVTOL aircraft, 10 in China, 8 overseas. To my knowledge, these are the first sales of passenger eVTOL aircraft in the world.]

What’s the Plan?

This makes the development of airspace markets and unmanned traffic management (UTM) systems all the more urgent. What regulators must guard against is first-movers squatting on high-revenue aerial routes.

Airspace is nominally a common-pool resource, rationed via regulation and custom. That worked tolerably well in the Wright brothers era and the jet age. Still, there are massive distortions and competitive problems because an oligopoly of first movers acquired popular routes and airport terminals. The common-pool resource model for airspace also leaves regulators with few tools to ration access sensibly.

From my airspace policy paper:

For example, in 1968, nearly one-third of peak-time New York City air traffic—the busiest region in the United States—was general aviation (that is, small, personal) aircraft. To combat severe congestion, local authorities raised minimum landing fees by a mere $20 (1968 dollars) on sub 25-seat aircraft. General aviation traffic at peak times immediately fell by more than 30 percent, suggesting that a massive amount of pre-July 1968 air traffic in the region was low value. The share of aircraft delayed by 30 or more minutes fell from 17 percent to about 8 percent. Similarly, Logan Airport raised fees on small aircraft in the 1980s in order to lessen congestion. The scheme worked, and general aviation traffic fell by about one-third, though the fee hike was later overturned.

There's a revolution in aviation policy occurring. The arrival of drones, eVTOL, and urban air mobility requires a totally different framework. It seems inevitable that a layer-cake or corridor approach to airspace management will develop, even though the FAA currently resists it. As with the American frontier or radio spectrum, a demand shock to an Ostromian common-pool resource leads to enclosure and property rights.

Already, first movers and the government are collaborating on UTM and airspace policy. This early collaboration on technology and norms is necessary, but regulators must resist letting today's collaboration harden into tomorrow's oligopoly. They will be under immense pressure, from inside and outside the agency, to anoint a single UTM provider or a few hand-picked vendors.

A single UTM system, or a tightly integrated system with a few private system operators, would reproduce many of the problems with today's air traffic management. It is very hard to update information-rich systems, especially air traffic control systems, as the delayed, over-budget NextGen modernization shows. Today there are 16,000 FAA workers working on the NextGen project, which has been ongoing since 1983. UTM will be an even more information-rich system. A system-wide upgrade to UTM would make NextGen modernization look simple by comparison.

Further, once the urban air mobility market develops, the first movers (UTM and eVTOL operators) will resist newcomers and new UTM technologies. Exclusive aerial corridors, as opposed to the shared corridors regulators plan for today, would allow competing UTM systems with only basic interoperability requirements.

Quick Hits

NETT Council: In March, USDOT Secretary Chao announced the formation of the Non-Traditional and Emerging Transportation Technology Council. It sounds great, and one of the likely topics the Council will take up is urban air mobility.

ASI Aviation Report, “Taking Off”: The Adam Smith Institute (UK) this week published an excellent report from Matthew Lesh about improving competition and service in aviation. The UK often leads the world in deregulation and market-based management of government property (like AIP in spectrum policy), and ASI has been influential in aviation policy in particular. Report highlights:

  1. Analyzes terminal competition policies for Heathrow (which is in the midst of a major expansion project)
  2. Proposes additional auctions for takeoff and landing slots at UK airports
  3. Endorses aerial corridor auctions for air taxis and eVTOL

Government study of airspace auctions: My proposal that the FAA auction aerial corridors for eVTOL caught the attention of the FAA’s Drone Advisory Committee and was included in a working group’s 2018 report about ways to finance drone and eVTOL regulation. Section 360 of the FAA Reauthorization Act, passed a few months after the working group report came out, then instructed the GAO to study ways of financing drone and eVTOL regulation. The law specifies that the GAO must study the six proposals in that working group report, including the auction of aerial corridors.

Lincoln Network Conference: I recently had the privilege of speaking at the Lincoln Network’s Reboot American Innovation conference. Jamie Boone (CTA) and I gave a fireside chat about the fast-moving urban air mobility sector. Matt Parlmer, founder of Ohlogen, was a great moderator. Video here.

eVTOL in North Carolina: The North Carolina state appropriations bill, which is nearing passage, allocates some funds to the Lt. Governor’s office to study eVTOLs, consult with experts, and convene an eVTOL summit in the next year. The Lt. Governor might also form a state advisory committee on eVTOL, a good, forward-looking policy for states given the rapid pace of progress in urban air mobility. To my knowledge, North Carolina is the first state to dedicate funding for study of this industry.

Cato Unbound is taking on the issue of tech expertise this month, and the lead essay comes from Kevin Kosar, who argues for the revival of the Office of Technology Assessment. As he explains,

[N]o one wants Congress enacting policies that make us worse off, or that delay or stifle technologies that improve our lives. And yet this kind of bad policy happens with lamentable frequency. Pluralistic politics inevitably features some self-serving interests that are more powerful and politically persuasive than others. This is why government often undertakes bailouts and other actions that are odious to the public writ large.  

He continues, “Congress’s ineptitude in [science and technology policy] has been richly displayed.” To help embed expertise in science and technology policy, Kosar argues for the revival of the Office of Technology Assessment, which was established in 1972 and defunded in 1995.

I have been on the OTA beat for a little while now, and so I offered some criticism of Kosar's proposal, which you can find here. I'll lay out my cards: I've been skeptical of reviving the OTA in the past and I remain so. Here is my key graf on that:

Elsewhere, I have argued that the OTA should be seen as a last resort; there are other ways of embedding expertise in Congress, like boosting staff and reforming hiring practices. The following essay makes a slightly different argument, namely, that the history of the OTA shows the razor wire on which a revived version of the agency will have to balance. In its early years, the OTA was dogged by accusations of partiality. Having established itself as a neutral party throughout the 1980s, the OTA was abolished because it failed to distinguish itself among competing agencies. There is an underlying political economy to expertise that makes the revival of the OTA difficult, undercutting it as an option for expanding tech expertise. In a modern political environment where scientific knowledge is politicized and budgets are tight, the OTA would likely face the hatchet once again. Continue reading →

Two weeks ago, Gov. Polis signed a bill that generally cuts off Colorado state funds from ISPs that commit "net neutrality violations" in the state. Oddly, I've seen no coverage from national outlets and barely a mention from local outlets. Perhaps journalists and readers have tired of what Larry Downes has dubbed the net neutrality farce, a debate about Internet regulation that has distracted the FCC and lawmakers for over a decade.

There's not much new in the net neutrality debate, but Colorado did tread new ground: a House amendment to allow ISPs to filter adult content barely failed on a 32-32 tie vote. Net neutrality in the US runs into First Amendment and Section 230 problems, and that amendment is the first time I've seen the issue raised by a state legislature.

A few thoughts on the law. In March, I was invited to testify before a Colorado House committee about net neutrality, broadband, and the policy implications of the then-pending bill. I commended the bill drafters for scrupulously attempting to narrow their bill to intrastate consumer protection issues. Nevertheless, it was my view that the Colorado law, as written, wouldn't survive judicial review if litigated.

States can have agreements with vendors and contractors and can require them to abide by certain contractual terms. However, courts have held that states cannot, as Seth Cooper has pointed out, use their contractual relationships with firms to extract concessions that are "tantamount to regulation." State agencies cannot attempt an end run around federal laws that prevent state regulation of Internet services generally, and net neutrality regulation in particular.

My testimony:

Good afternoon. My name is Brent Skorup and I am a senior research fellow at the Mercatus Center at George Mason University. I also serve on the Broadband Deployment Advisory Committee of the Federal Communications Commission (FCC).

It is commendable that state legislatures, governors, and cities around the country, including in Colorado, are prioritizing broadband deployment. The focus should remain on the pressing broadband issues of competition and deployment. The political battles in Washington, DC, about net neutrality, which I have observed over the past decade, have alarmingly spread to statehouses in recent months, and they will distract from far more important issues.

Lawmakers should enter the debate with their eyes wide open about the stakes and the unintended effects of internet regulation. By imposing network management rules on certain providers, SB 19-078 conflicts with federal policy, codified in the Telecommunications Act, that internet access should be “unfettered by Federal or State regulation.”

First, net neutrality laws and regulations do not accomplish what they purportedly accomplish. As the FCC revealed when it defended its net neutrality regulations in federal court in 2016, any no-blocking rule is mostly unenforceable. As a tech journalist put it, internet service providers (ISPs) can “exempt [themselves] from the net neutrality rules”—the rules are “essentially voluntary.” The same problem arises with state net neutrality laws.

Second, state internet regulations are unlikely to survive judicial review. Internet access is inherently interstate: simply streaming a YouTube video or sending an email often transmits data across state lines. State attempts to regulate treatment of internet access therefore likely violate federal law, which vests authority to regulate interstate communications with the FCC.

Third, the bill penalizes small, rural carriers. There’s a saying in politics: “If you’re not at the table, you’re on the menu.” It appears that Colorado’s rural broadband providers are “on the menu.” The bill applies internet regulations only to companies receiving state support (13 companies, each one serving rural areas). With the exception of CenturyLink, these are very small telecommunications companies, and the smallest had 64 customers. It is a puzzle why the state would add regulations and compliance costs to rural ISPs at a time when the FCC and most states are doing everything possible to help deploy broadband in rural areas.

This is not a plea to “do nothing” in Colorado regarding broadband. The FCC’s Broadband Deployment Advisory Committee has several recommendations for states and localities to improve broadband deployment.

Further, the FCC and some states are considering making it easier for private property owners to install wireless antennas without local regulation and fees, much like how satellite dishes are installed.

Finally, the legislature could also urge flexibility from the FCC regarding the federal high-cost fund, which disburses about $60 million annually to carriers in Colorado. My preliminary estimates using FCC data suggest that, under a new voucher program, every rural household in Colorado could receive $15 to $20 per month to reduce their monthly broadband bill.

Testimony on the Mercatus website here.
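For readers curious how the voucher estimate in that testimony might pencil out, here is a rough, illustrative sketch in Python. The $60 million annual high-cost figure comes from the testimony; the rural-household count below is my own placeholder assumption for illustration, not an official number.

```python
# Rough, illustrative voucher arithmetic for Colorado's federal high-cost support.
# The $60 million annual figure is cited in the testimony above; the household
# count below is a placeholder assumption for illustration only.

annual_high_cost_support = 60_000_000    # dollars per year flowing to Colorado carriers
assumed_rural_households = 275_000       # hypothetical count of eligible rural households

monthly_pool = annual_high_cost_support / 12
voucher = monthly_pool / assumed_rural_households
print(f"~${voucher:.2f} per rural household per month")  # ~$18, within the $15-$20 range cited
```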

[This essay originally appeared on the AIER blog on May 23, 2019 under the title, “Spring Cleaning for the Regulatory State.”]

_____________________________

Spring is in full blossom, and many of us are in the midst of our annual house-cleaning ritual. A regular deep clean makes good sense because it makes our living spaces more orderly and gets rid of the gunk and grime that has amassed over the past year.

Unfortunately, governments almost never engage in their own spring-cleaning exercise. Statutes and regulations continue to accumulate, layer by layer, until they suffocate not only economic opportunity, but also the effective administration of government itself. Luckily, some states have realized this and have taken steps to help address this problem.

Mountains of Regulations

First, here are some hard facts about regulatory accumulation:

  • Red tape grows: Since the first edition of his annual publication Ten Thousand Commandments in 1993, Wayne Crews has documented how federal agencies have issued 101,380 rules. Other reports find agency staffing levels jumped from 57,109 to 277,163 employees from 1960 to 2017, while agency budgets swelled in real terms from $3 billion in 1960 to $58 billion in 2017 (2009$).
  • Nothing ever gets cleaned up: A Deloitte survey of the U.S. Code reveals that 68 percent of federal regulations have never been updated and that 17 percent have only been updated once. If a company never updated its business model, it would fail eventually. But governments get away with doing the same thing without any fear of failure. “If it were a country, U.S. regulation would be the world’s eighth-largest economy, ranking behind India and ahead of Italy,” Crews notes.
  • The burden of regulatory accumulation is getting worse: “The estimate for regulatory compliance and economic effects of federal intervention is $1.9 trillion annually,” Crews finds, which is equal to 10 percent of the U.S. gross domestic product for 2017. When regulatory costs are added to federal spending, Crews finds, the combined burden equals $4.173 trillion, or 30 percent of the entire economy. Mercatus Center research has found that “economic growth in the United States has, on average, been slowed by 0.8 percent per year since 1980 owing to the cumulative effects of regulation.” This means that “the US economy would have been about 25 percent larger than it actually was as of 2012” if regulation had been held to roughly the same aggregate level it stood at in 1980. (A back-of-envelope check of that compounding effect appears just after this list.)
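That last Mercatus figure is a compounding claim, and a minimal sketch shows how the arithmetic works. The version below is my own illustration; it assumes the 0.8-percentage-point annual drag applies uniformly from 1980 to 2012, whereas the published estimate comes from a richer growth model, so expect ballpark agreement rather than an exact match.

```python
# Back-of-envelope check of the compounding effect of a 0.8-percentage-point
# annual growth drag over 1980-2012. Assumption (mine, for illustration only):
# the drag applies uniformly each year.

years = 2012 - 1980          # 32 years of forgone growth
annual_drag = 0.008          # 0.8 percentage points per year

# How much larger the counterfactual (less-regulated) economy would be by 2012
shortfall = (1 + annual_drag) ** years - 1
print(f"Counterfactual economy larger by roughly {shortfall:.0%}")  # ~29%, same ballpark as the cited ~25%
```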

In sum, the evidence shows that red tape is growing without constraint, hindering entrepreneurship and innovation, deterring new investment, raising costs to consumers, limiting worker opportunities and wages, and undermining economic growth.

Regulations accumulate in this fashion because the administrative state is on autopilot. Legislatures pass broad statutes delegating ambiguous authority to agencies. Bureaucrats are then free to roll the regulatory snowball down the hill until it has become so big that its momentum cannot be stopped.

The Death of Common Sense

Policy makers enact new rules with the best of intentions, of course, but we should not assume that the untrammeled growth of the regulatory state produces positive results. There is no free lunch, after all. Every regulation is a restriction on opportunities for experimentation with new and potentially better ways of doing things. Sometimes such restrictions make sense because regulations can pass a reasonable cost-benefit test. It would be foolish to assume that all regulations on the books do.

Spring cleaning for the regulatory state, therefore, should be viewed as an exercise in “good governance.” The goal is not to get rid of all regulations. The goal is to make sure that rules are reasonable and cost-effective so that the public can actually understand the law and get the highest value out of their government institutions.

Philip K. Howard, founder and chair of the nonprofit coalition Common Good and the author of The Death of Common Sense, has written extensively about how regulatory accumulation has become a chronic problem. “Too much law,” he argues, “can have similar effects as too little law.” “People slow down, they become defensive, they don’t initiate projects because they are surrounded by legal risks and bureaucratic hurdles,” Howard notes. “They tiptoe through the day looking over their shoulders rather than driving forward on the power of their instincts. Instead of trial and error, they focus on avoiding error.”

In such an environment, risk-taking and entrepreneurialism are more challenging and economic dynamism suffers. But regulatory accumulation also hurts the quality of government institutions and policies, which become fundamentally incomprehensible or illogical. “Society can’t function when stuck in a heap of accumulated mandates of past generations,” Howard concludes. This is why an occasional regulatory house cleaning is essential to unleash economic opportunity and improve the functioning of our democratic institutions.

Regulatory House Cleaning Begins

Reforms to address this problem are finally happening. In a series of new essays, my colleague James Broughel has documented how several states — including Idaho, Ohio, Virginia, and New Jersey — are undertaking serious efforts to get regulatory accumulation under control. They are utilizing a variety of mechanisms, including “regulatory reduction pilot programs” and “red tape review commissions.” Recently, Idaho actually initiated a sunset of its entire regulatory code and will now try to figure out how to clean up its 8,200 pages of regulations containing 736 chapters of state rules.

Meanwhile, other states are undertaking serious reform in one of the worst forms of regulatory accumulation: occupational licenses. The Federal Trade Commission notes that roughly 30 percent of American jobs require a license today, up from less than 5 percent in the 1950s. Research by economist Morris Kleiner and others finds that “restrictions from occupational licensing can result in up to 2.85 million fewer jobs nationwide, with an annual cost to consumers of $203 billion.” And many of the rules do not even serve their intended purpose. A major 2015 Obama administration report on the costs of occupational licensing concluded that “most research does not find that licensing improves quality or public health and safety.”

Arizona, West Virginia, and Nebraska are among the leaders in reforming occupational-licensing regimes, using a variety of approaches. In some cases, the reforms sunset licensing rules for specific professions altogether. Other proposals grant workers reciprocity to use a license they obtained in another state. Finally, some states have proposed letting most professions operate without any license at all, but then requiring them to make it clear to consumers that they are unlicensed.

The Need for a Fresh Look

Sunsets are not silver-bullet solutions, and the recent experience with sunsetting and “de-licensing” requirements at the state level has been mixed because many legislatures ignore or circumvent requirements. Nonetheless, sunsets can still help prompt much-needed discussions about which rules make sense and which ones no longer do.

Sunsets can be forward-looking, too. I have proposed that when policy makers craft new laws, especially for fast-paced tech sectors, they should incorporate what we might think of as “the Sunsetting Imperative.” It would demand that any existing or newly imposed technology regulation include a provision sunsetting the law or regulation within two years. Reforms like these are also sometimes referred to as “temporary legislation” or “fresh look” requirements. Policy makers can always reenact rules that are still relevant and needed.

By forcing a periodic spring cleaning, sunsets and fresh-look requirements can help stem the tide of regulatory accumulation and ensure that only those policies that serve a pressing need remain on the books. There is no good reason for governments not to clean up their messes on occasion, just like the rest of us have to.

Congress should let the Satellite Television Extension and Localism Act Reauthorization (STELAR) of 2014 expire at the end of this year. STELAR is the most recent reincarnation of the Satellite Home Viewer Act of 1988, a law that has long since outlived its purpose.

Owners of home satellite dishes in the 1980s—who were largely concentrated in rural areas—were receiving retransmission of popular television programs via satellite carriers in apparent violation of copyright law. When copyright owners objected, Congress established a compulsory, statutory license mandating that content providers allow secondary transmission via satellite to areas unserved by either a broadcaster or a cable operator, and requiring satellite carriers to compensate copyright holders at the rate of 3 cents per subscriber per month for the retransmission of a network TV station or 12 cents for a cable superstation.

The retransmission fees were purposely set low to help the emerging satellite carriers get established in the marketplace when innovation in satellite technology still had a long way to go. Today the carriers are thriving business enterprises, and there is no need for them to continue receiving subsidies. Broadcasters, on the other hand, face unprecedented competition for advertising revenue that historically covered the entire cost of content production.

Today a broadcaster receives 28 cents per subscriber per month when a satellite carrier retransmits its local television signal. But the fair market value of that signal is actually $2.50, according to one estimate.

There is no reason retransmission fees cannot be “determined in the marketplace through negotiations among carriers, broadcasters and copyright holders,” as the Reagan administration suggested in 1988.

Aside from perpetuating an unjustified subsidy, renewal of STELAR may prevent owners of home satellite dishes in the nation’s twelve smallest Designated Market Areas from receiving programming from their own local broadcast TV stations.

Due to severe capacity constraints inherent in satellite technology in the 1980s, the statutory license originally allowed satellite carriers to retransmit a single, distant signal (e.g. from a New York or Los Angeles network affiliate) throughout their entire footprint. As the technology has improved, the statutory license has been expanded in recent years to include local-into-local retransmission. DISH Network, which already provides local-into-local retransmission throughout the nation (in all 210 DMAs), has demonstrated that a statutory license for distant signals is no longer necessary or warranted.

Although DirecTV does not yet offer nationwide local-into-local retransmission, this is a voluntary business decision that should not dictate the renewal of a statutory license based on 30-year-old technology.


An interesting divide has opened up in recent months among right-of-center groups about what the FCC should do with the “C Band.” A few weeks ago, the FCC requested public comment on how to proceed with the band.

The C Band is 500 MHz of spectrum that the FCC, like regulators around the globe, dedicated for satellite use years ago and gave to satellite companies to share among themselves. Satellite operators typically use it to transmit cable programming to a regional cable network operations center, where it is bundled and relayed to cable subscribers. However, the C Band would work terrifically if repurposed for 5G and cellular services. As Joe Kane explained in a white paper, the FCC and telecom companies are exploring various ways of accomplishing that.

Free-market groups disagree. Should the FCC prioritize:

  • the quick deployment of new wireless services, or
  • deficit reduction and limiting FCC-granted windfalls?

This is a complex question since we’re dealing with the allocation of public property. Both sides, in my view, have a defensible free-market position. There are other non-trivial C Band issues like interference protection and the FCC’s authority to act here, but I’ll address the ideological split on the right.

The case for secondary markets

The full 500 MHz of “clean” C Band in the US would be worth tens of billions of dollars to cellular companies. However, the current satellite users don’t want to part with all of it, and a group of satellite companies using the spectrum estimates it could sell 200 MHz to cellular carriers if the FCC would liberalize its rules to allow flexible uses (like 5G), not merely satellite services. The satellite providers would then be able to sell much of their spectrum on the secondary market (probably to cellular providers) at a nice premium.

Prof. Dan Lyons and Roslyn Layton wrote in support of the secondary market plan on the AEI blog and at Forbes, respectively. Joe Kane also favors the approach. As they say, the benefit of secondary market sales is that they will likely lead to a significant and fast repurposing of the C Band for mobile use. The consumer benefits of dezoned spectrum are large, and with every year of inaction, billions of dollars of consumer welfare evaporate. Hazlett and Munoz estimate that spectrum reallocated from a restricted use to flexible use generates annual consumer benefits of the same order of magnitude as the auction value of the spectrum.

I’d add that there’s a history of the FCC de-zoning spectrum (SMR spectrum in 2004, EBS spectrum in 2004, AWS-4 in 2011, WCS spectrum in 2012). The FCC is considering doing this with some government spectrum that Ligado or others could repurpose for mobile broadband. In these cases, the FCC upzoned spectrum so that it could be used for higher-valued uses rather than the legacy uses required by previous FCCs. The circumstances and technologies vary, but some of these bands were repurposed quickly for better uses by cellular providers and are used for 4G LTE today by tens of millions of Americans.

The case for FCC auction

Liberalizing spectrum quickly gets spectrum to higher-valued uses, but it does raise the complaint that existing users gain an unfair windfall. I’m not sure when the C Band was allocated for satellite use, but many legacy spectrum assignments were given to industries for free.

When the FCC “upzones” spectrum, it typically increases the value of the band. The “secondary market” plan is akin to the government giving away a parcel of public land to a developer to be used for a gas station, then deciding years later to upzone the land so that condos or office buildings can be built on it. It’s a better use for the land, but the gas station operator gains a big windfall when the property value increases. Not only is there a windfall; the government also captures no revenue from the increase in the value of public property.

Free-market groups like Americans for Tax Reform, Taxpayers Protection Alliance, and Citizens Against Government Waste favor the FCC reclaiming the spectrum from satellite providers, perhaps via incentive auction, and collecting government revenue by re-selling it. If the FCC went the incentive auction route, it would purchase the “satellite spectrum” (i.e., at a low price) from the current C Band users, upzone it, and re-sell that spectrum as “mobile spectrum” (i.e., at a high price) in an open auction. The FCC and the Treasury would pocket the difference, probably several billion dollars here.
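To make the mechanics concrete, here is a stylized sketch of the incentive-auction arithmetic. Every price below is a hypothetical placeholder of my own choosing; actual C Band values would be set by bidding. Only the structure matters: buy at a “satellite” price, upzone, re-sell at a “mobile” price, and keep the difference.

```python
# Stylized incentive-auction arithmetic with hypothetical placeholder prices.
# Real C Band values would be determined by bidding; only the structure matters here.

mhz_reclaimed = 200                    # MHz the satellite operators say they could clear
buy_price_per_mhz = 10_000_000         # hypothetical "satellite spectrum" price (low)
sell_price_per_mhz = 50_000_000        # hypothetical "mobile spectrum" price after upzoning (high)

payment_to_incumbents = mhz_reclaimed * buy_price_per_mhz
auction_revenue = mhz_reclaimed * sell_price_per_mhz
treasury_gain = auction_revenue - payment_to_incumbents
print(f"Treasury keeps about ${treasury_gain / 1e9:.0f} billion under these assumed prices")  # ~$8 billion
```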

The FCC has only done one incentive auction, the 600 MHz auction. There, the FCC purchased “TV spectrum” from broadcasters and re-sold it to wireless carriers.

The benefit of this approach is deficit reduction, and there’s more perceived fairness since there’s no big, FCC-granted windfall to legacy users. The downside is that it’s a slower, more complicated process, since the FCC is deeply involved in the spectrum transfer. Arguably, however, the FCC should be deeply involved and interested in government revenue, since spectrum is public property.

My view

A few years ago I would have definitely favored speed and the secondary market plan. I still lean toward that approach, but I’m a little more on the fence after reading work by Richard Epstein and others on the “public trust doctrine.” This is a traditional governance principle that requires public actors to receive fair value when disposing of public property. It prevents public institutions from giving discounted public property to friends and cronies. Clearly, cronyism isn’t at work here, and today’s FCC can’t undo what past FCCs did generations ago in giving away spectrum. I think the need for speedy deployment trumps the windfall issue here, but it’s a closer call for me than in the past.

One proposal that hasn’t been contemplated with the C Band but might have merit is an overlay auction with a deadline. With such an auction, the FCC gives incumbent users a deadline to vacate a band (say, 5 years). The FCC then auctions flexible-use licenses in the band. The FCC receives the auction revenues and the winning bidders are allowed to deploy services in the “white spaces” unoccupied by the incumbents. The winning bidders are allowed to pay the incumbents to move out before the deadline.

With an overlay auction, you get fairly rapid deployment–at least in the white spaces–and the government gains revenue from the auction. This type of auction was used to deploy cellular (PCS) in the 1990s and cellular (AWS-1) in the 2000s. However, incumbents dislike it because the deadline devalues their existing spectrum holdings.

I think overlay auctions should be considered in more spectrum proceedings because they avoid the serious windfall problems while also allowing rapid deployment of new services. That doesn’t seem to be in the cards, however, and the secondary market approach seems like the next best option.

In my first essay for the American Institute for Economic Research, I discuss what lessons the great prophet of innovation Joseph Schumpeter might have for us in the midst of today’s “techlash” and rising tide of technopanics. I argue that, “[i]f Schumpeter were alive today, he’d have two important lessons to teach us about the techlash and why we should be wary of misguided interventions into the Digital Economy.” Specifically:

We can summarize Schumpeter’s first lesson in two words: Change happens. But disruptive change only happens in the right policy environment. Which gets to the second great lesson that Schumpeter can still teach us today, and which can also be summarized in two words: Incentives matter. Entrepreneurs will continuously drive dynamic, disruptive change, but only if public policy allows it.

Schumpeter’s now-famous model of “creative destruction” explained why economies are never in a state of static equilibrium and why entrepreneurial competition comes from many (usually completely unpredictable) sources. “This kind of competition is much more effective than the other,” he argued, because the “ever-present threat” of dynamic, disruptive change “disciplines before it attacks.”

But if we want innovators to take big risks and challenge existing incumbents and their market power, then it is essential that we get policy incentives right, or else this sort of creative destruction will never come about. The problem with too much of today’s “techlash” thinking is that it imagines the current players are here to stay and that their market power is unassailable. Again, that is static “snapshot” thinking that ignores the reality that new generations of entrepreneurs are in a sort of race for a prize and will make big bets on the future in the face of seemingly astronomical odds against their success. But we have to give them a chance to win that “prize” if we want to see that dynamic, disruptive change happen.

As always, we have much to learn from Schumpeter. Jump over to the AIER website to read the entire essay.

Many have likened efforts to build out rural broadband today to the accomplishments of rural electrification in the 1930s. But the two couldn’t be more different. From the structure of the program and underlying costs to the impact on productivity, rural electrification was drastically different from current efforts to get broadband to rural regions. My recent piece at RealClearPolicy explores some of those differences, but there is one area I wasn’t able to explore: the question of cost. If a government agency, any government agency for that matter, were able to repeat electrification’s dramatic reduction in cost for broadband, the US wouldn’t have a deployment problem. Continue reading →

It was my great pleasure to recently join Paul Matzko and Will Duffield on the Building Tomorrow podcast to discuss some of the themes in my last book and my forthcoming one. During our 50-minute conversation, which you can listen to here, we discussed:

  • the “pacing problem” and how it complicates technological governance efforts;
  • the steady rise of “innovation arbitrage” and medical tourism across the globe;
  • the continued growth of “evasive entrepreneurialism” (i.e., efforts to evade traditional laws & regs while innovating);
  • new forms of “technological civil disobedience;”
  • the rapid expansion of “soft law” governance mechanisms as a response to these challenges; and,
  • craft beer bootlegging tips!  (Seriously, I move a lot of beer in the underground barter markets).

Bounce over to the Building Tomorrow site and give the show a listen. Fun chat.