Google’s announcement this week of plans to expand to dozens more cities got me thinking about the broadband market and some parallels to transportation markets. Taxi cab and broadband companies alike are seeing their business plans undermined by the emergence of nimble Silicon Valley firms–Uber and Google Fiber, respectively.

The incumbent operators in both cases were subject to costly regulatory obligations in the past but in return they were given some protection from competitors. The taxi medallion system and local cable franchise requirements made new entry difficult. Uber and Google have managed to break into the market through popular innovations, the persistence to work with local regulators, and motivated supporters. Now, in both industries, localities are considering forbearing from regulations and welcoming a competitor that poses an economic threat to the existing operators.

Notably, Google Fiber will not be subject to the extensive build-out requirements imposed on cable companies, which typically built their networks according to local franchise agreements in the 1970s and 1980s. Google, in contrast, generally does substantial market research to see whether there is an adequate uptake rate among households in particular areas. Neighborhoods that show sufficient interest in Google Fiber become Fiberhoods.

Similarly, companies like Uber and Lyft are exempted from many of the regulations governing taxis. Taxi rates are regulated and drivers have little discretion in deciding whom to transport, for instance. Uber and Lyft drivers, in contrast, are not price-regulated and can allow rates to rise and fall with demand. Further, Uber and Lyft have a two-way rating system: drivers rate passengers and passengers rate drivers via smartphone apps. This innovation lowers costs and improves safety: the rider who throws up in cars after bar-hopping, who verbally or physically abuses drivers (one Chicago cab driver told me he was held up at gunpoint several times per year), or who is constantly late will eventually have a hard time hailing an Uber or Lyft. The rating system naturally forces out expensive riders (and ill-tempered drivers).

Interestingly, support and opposition for Uber and Google Fiber cuts across partisan lines (and across households–my wife, after hearing my argument, is not as sanguine about these upstarts). Because these companies upset long-held expectations, express or implied, strong opposition remains. Nevertheless, states and localities should welcome the rapid expansion of both Uber and Google Fiber.

The taxi registration systems and the cable franchise agreements were major regulatory mistakes. Local regulators should reduce regulations for all similarly-situated competitors and resist the temptation to remedy past errors with more distortions. Of course, there is a decades-long debate about when deregulation turns into subsidies, and this conversation applies to Uber and Google Fiber.

That debate is important, but regulators and policymakers should take every chance to roll back the rules of the past–not layer on more mandates in an ill-conceived attempt to “level the playing field.” Transportation and broadband markets are changing for the better with more competition and localities should generally stand aside.

James Barrat, author of Our Final Invention: Artificial Intelligence and the End of the Human Era, discusses the future of Artificial Intelligence (AI). Barrat takes a look at how to create friendly AI with human characteristics, which other countries are developing AI, and what we could expect with the arrival of the Singularity. He also touches on the evolution of AI and how companies like Google and IBM and government entities like DARPA and the NSA are developing artificial general intelligence devices right now.


Call it what you want: a bailout, a thumb on the scales, bidder restrictions–the FCC might conspicuously intervene in the 2015 incentive auctions at the behest of smaller carriers and public interest advocates.

Chairman Wheeler’s recent comments indicate the FCC may devise a way to prevent the largest two carriers–AT&T and Verizon–from purchasing “too much” of the television broadcasters’ spectrum at auction. AT&T likely sees the writing on the wall and argues that if there are auction limits, the restrictions should apply only to the auction, rather than more extreme restrictions that would penalize AT&T and Verizon, the largest carriers, for previously-acquired spectrum. As The Switch’s Brian Fung put it,

the small carriers favor what are called “asymmetric” spectrum caps that affect various carriers differently, while opponents prefer “symmetric” caps that don’t account for existing market positions.

While I wish AT&T put up more of a fight against auction interventions, they (and staff at the FCC) are handicapped in pursuing an unrestricted auction. The blame lies mostly with Congress, which gave the FCC vague (and thus ripe for abuse) and conflicting mandates spanning decades. The 1993 law authorizing auctions, for instance, requires the FCC to “avoid[] excessive concentration of licenses” and to “disseminat[e] licenses among a wide variety of applicants,” among other regulatory carve-outs for smaller competitors. These latter requirements, if implemented as rigorously as smaller carriers would like, directly undermine the purpose of the 2012 American Taxpayer Relief Act, which requires that the upcoming spectrum auctions raise $7 billion for a public safety broadband network and $20 billion for deficit reduction.

By asymmetrically penalizing AT&T and Verizon, the FCC increases the probability the auction fails to raise the tens of billions of dollars needed (see Fred Campbell’s recent paper). I haven’t heard a policymaker speak about the incentive auction without remarking how extraordinarily complex it is. That complexity–as was made clear in this week’s Senate hearing on the subject–means no one knows how much spectrum will be auctioned off or how much money will be raised. I was doubtful the FCC would secure the called-for 120 MHz for auction in the first place, but the Senate hearing convinced me that they might not get even 60 MHz. If the FCC meddles too much and the broadcasters aren’t assured they’ll get top dollar for their spectrum, the broadcasters might not show up to sell.

For many reasons, the FCC should ignore the pressure to restrict the large carriers in bidding. Smaller carriers argue the large carriers will outbid them only to preclude competition and hoard the spectrum. Every major carrier is spending billions to rapidly expand its footprint and capacity, so the hoarding argument is hard to accept (not to mention, carriers face FCC build-out requirements). The hoarding argument also confounds me because AT&T and Verizon are at the forefront of arguing for more spectrum auctions, particularly of spectrum from federal agencies. Would they want the market flooded with new spectrum only so they could spend billions to hoard it?

Asymmetric auction restrictions also resemble a bailout for smaller carriers. T-Mobile and Sprint–who most actively lobby for auction restrictions–are not mom-and-pop establishments. Each is a sophisticated, powerful corporation with access to capital markets, backed by a larger international telecom–Germany’s Deutsche Telekom for T-Mobile and Japan’s SoftBank for Sprint. DT and SoftBank have both pledged to spend billions in the next few years to improve their American carriers’ competitive positions. Such carriers do not need an FCC handout.

The bailout resemblance is more apparent when you realize Sprint has been hamstrung for nearly a decade with damaging business decisions. Three come immediately to mind: 1) the dreadful merger with Nextel in 2005; 2) the ill-fated bet in 2008 to forgo LTE rollout in favor of WiMax, a competing 4G standard; and 3) the loss of over one million customers when it discontinued its push-to-talk iDEN service for network upgrades. The losses from the Nextel merger alone approach $30 billion.

To be clear, I don’t second-guess Sprint’s decisions. They did what innovative firms are supposed to do in attempting big, risky investments. However, it should not be the job of the FCC to favor some firms through spectrum auctions because some carriers’ business decisions did not pan out. That is not a competitive wireless auction–that is an FCC-orchestrated bailout. Granted, the FCC has been handed conflicting mandates. The Commission has ample discretion, however, to conduct a competitive auction that both complies with the law and improves chances of reaching the ambitious revenue goals. Intense meddling with auction results could prove disastrous.

There is bipartisan agreement that the 1996 Telecom Act was antiquated only shortly after President Clinton’s signature had dried on the legislation. There is also consensus that spectrum policy, still largely grounded in the 1934 communications statute, absolutely distorts today’s wireless markets. And there is frequent criticism from thought leaders, right and left, that the FCC has been, for decades, too accommodating to the firms it regulates and too beholden to the status quo (economist Thomas Hazlett quips the agency’s initials stand for “Forever Captured by Corporations”).

For these reasons, members of Congress every few years announce their intention to reform the 1934 and 1996 communications laws and modernize the FCC. Yesterday, some powerful House members unexpectedly reignited hopes that Congress would overhaul our telecom, broadband, and video laws. In a Google Hangout (!), Reps. Fred Upton and Greg Walden said they wanted to take on the ambitious task of passing a new law in 2015.

Much depends on next year’s elections and the composition of Congress, but hopefully the announcement spurs a major re-write that eliminates regulatory distortions in communications, much as airlines and transportation were deregulated in the 1970s–an effort led by reformist Democrats.

About ten years ago, more than fifty scholars and technologists crafted the reports that constitute the Digital Age Communications Act (DACA), a largely deregulatory framework (a majority of the group had served in Democratic administrations, interestingly enough). In 2005, then-Sen. Jim DeMint proposed a bill similar to the working group’s proposals. The working group’s recommendations have aged very well over eight years–which you can’t say about the 1996 Act–and represent a great starting point for future legislation.

As Adam has said, the DACA reports have five primary reform objectives:

- Replacing the amorphous “public interest” standard with a consumer welfare standard, which is better established in the field of antitrust law

- Eliminating regulatory silos and leveling the playing field through deregulation

- Comprehensively reforming spectrum policy, not just through more auctioning but through clear property rights

- Reforming universal service by either voucherizing it or devolving it to the states and letting them run their own telecom welfare programs

- Significantly reforming and downsizing the scope of the FCC’s power over the modern information economy

DACA redefines the FCC as a specialized competition agency for the communications sector. The FCC largely sees itself as a competition agency today, but the current statutes don’t reflect that gradual change in purpose. The FCC is slow and arbitrary, Balkanizes industries artificially, and attempts to regulate in areas it isn’t equipped to regulate–the agency has a notoriously bad record in federal courts. These characteristics create a poor environment for substantial investments in technology and communications infrastructure. The DACA proposals aren’t perfect, but they form a resilient framework that minimizes the effect of special interests in communications and encourages investments that improve consumers’ lives.

Both parties in Congress have been increasingly critical of federal agencies’ inefficient use of spectrum in the past few years, and it seems agencies are getting the message. The NTIA, the official manager of federal agency spectrum, released a letter yesterday announcing that the Department of Defense would be relocating some of its systems. Defense has reached an agreement with broadcasters under which Defense systems will share spectrum in the Broadcast Auxiliary Service (BAS) band.

The soon-to-be vacated band held by Defense will eventually be auctioned off–hopefully in 2014–for billions of dollars and likely used for mobile broadband provided by wireless carriers like AT&T, Verizon, Sprint, and T-Mobile. These carriers face serious congestion problems because of government-created scarcity of spectrum.

The carriers actually had targeted some of the BAS spectrum because they weren’t convinced Defense would be willing to move its systems. The broadcasters’ deal with Defense means everyone’s apparently happy–the broadcasters keep their BAS spectrum, the feds get new equipment and get Congress off their backs (temporarily), and the carriers get new spectrum for auction.

The deal is welcome news because the spectrum will be put to a higher-valued use once auctioned. The federal government pays almost nothing for its own spectrum and is a poor steward of the resource. Transferring spectrum from agencies to carriers means lower phone bills and more mobile broadband coverage. Government agencies are notoriously resistant to moving their systems or sharing with others, so entering into a sharing pact with the broadcasters indicates some of the resistance is thawing.

It’s not unequivocal good news, though.

The government is clearing out of a 25 MHz band of spectrum and moving into the larger, 85 MHz BAS band that it will share with broadcasters. The military needs a larger band because sharing imposes capacity constraints, necessitating new, agile systems that search the airwaves to make sure they don’t interfere with existing broadcast users. Dynamic sharing like this adds cost and complexity and may imperil next year’s planned auction.

Further, the BAS band is unavailable for auction only because of the antiquated command-and-control regime the FCC uses to award spectrum licenses. BAS is mostly used for electronic news gathering, which relays local and national newscasts from reporters on the scene to broadcast studios. Broadcasters have used BAS spectrum since the 1960s when it was allocated to them for free.

In a market, broadcasters likely would not hold as much BAS spectrum as they currently do. In fact, because of technology changes and squeezed newsroom budgets, broadcasters are finding cheaper alternatives. Increasingly, journalists are using carriers’ LTE networks to transmit their breaking newscasts, since that technology costs a fraction of what the news vans and equipment needed for BAS transmissions cost. That is to say, there are alternative business models in the absence of Soviet-style allocations.

So despite these industry changes, BAS spectrum cannot be auctioned for its highest-valued use (probably mobile broadband) under current FCC rules. Further, it will be even more difficult to bring the benefits of auctions to the airwaves if federal users are intermingling with existing users, broadcasters in this case. It’s a trend to be wary of. Let’s just hope that next year’s planned auctions occur on time so that more consumers can benefit from mobile broadband.

“Net neutrality is a dead man walking,” Marvin Ammori stated in Wired last week, citing the probable demise of the FCC’s Open Internet rules in court. I’d agree for a different reason. Net neutrality has been dead ever since the FCC released its net neutrality order in December 2010. (This is not to say the damaging rules should be upheld by the DC Circuit. For many reasons, the Order should be struck down.) I agree with Ammori because we already have the Internet “fast lane” many net neutrality proponents wanted to prevent. Since that goal is precluded, all the rules do is hang Damocles’ Sword over ISPs regarding traffic management.

The 2010 rules managed to make both sides unhappy. The ISPs face severe penalties if three FCC commissioners believe ISP network management practices “unreasonably discriminate” against certain traffic. Public interest groups, on the other hand, were dissatisfied because they wanted ISPs reclassified as common carriers to prevent deep-pocketed content creators from allying with ISPs to create an Internet “fast lane” for some companies, relegating most other websites to the so-called “winding dirt road” of the public Internet.

Proponents emphasize different goals of net neutrality (to the point–many argue–that it’s hard to discern what the term means). But if preventing the creation of a fast lane is the main goal of net neutrality, it’s dead already. Consider two popularly cited net neutrality “violations” that do not violate the Open Internet Order: Netflix’s Open Connect program and Comcast’s practice of not counting its Xfinity video-on-demand (VOD) service against customers’ data limits.

Both cases involve the creation of a fast lane for certain content and activists rail against them. Both cases also involve network practices expressly exempted from net neutrality regulations. The FCC exempted these sorts of services because they are important, benefit the public, and should be encouraged. With Open Connect, Netflix scatters its many servers across the country closer to households, which allows its content to stream at a higher quality than most other video sites. Comcast gives its Xfinity VOD fast-lane treatment as well, which is completely legal since VOD from a cable company is a “specialized service” exempt from the rules.

“Specialized service” needs some explanation since it’s a novel concept from the FCC order. The net neutrality rules distinguish between “broadband Internet access service” (BIAS)–to which the regulations apply–and specialized (or managed) services–to which they don’t apply. The exemption of specialized services opens up a dangerous loophole in the view of proponents.

BIAS is what most consider “the Internet.” It’s the everyday websites we access on our computers and smartphones. What are specialized services? In the sleepy month of August the FCC’s Open Internet Advisory Committee released its report on what criteria specialized service needs to meet to be exempt from net neutrality scrutiny (these are influential and advisory, but not binding):

1. The service doesn’t reach large parts of the Internet, and
2. The service is an “application level” service.

The Advisory Committee also thought that “capacity isolation” is a good indicator that a service should be exempt. With capacity isolation, the ISP has one broadband connection going to the home but is separating the service’s data stream from the conventional Internet stream consumers use to visit Facebook, YouTube, and the like. This is how Comcast’s streaming of Xfinity to Xboxes is exempt–it is a proprietary network going into the home. As long as carriers don’t divert BIAS capacity for the application, the FCC will likely turn a blind eye.

What are some examples? Specialized service is marked by higher-quality streams that typically don’t suffer from jitter and latency. If you have “digital voice” from Comcast, for example, you are receiving a specialized service–proprietary VoIP. Specialized service can also include data streams like VOD, e-reader downloads, heart monitor data, and gaming services. The FCC exempted these because some are important enough that they shouldn’t compete with BIAS Internet. It would be obviously damaging to have digital phone service or health monitors getting disrupted because others are checking up on their fantasy football team. The FCC also wanted to spur investment in specialized services and video companies like Netflix are considering pairing up with ISPs to deliver a better experience to customers.

That is to say, the net neutrality effort has failed even worse than most realize. The FCC essentially prohibited innovative business models in BIAS, freezing that service into common-carrier-like status. Further, we have an Internet fast lane (which I consider a significant public benefit, though net neutrality proponents often do not). As business models evolve and the costs of server networks fall, our two-tier system will become more apparent.

Jon Brodkin at Ars Technica and Brian Fung at The Switch have posts featuring a New America Foundation study, The Cost of Connectivity 2013, comparing international prices and speeds of broadband. As I told Fung when he asked for my assessment of the study, I was left wondering whether lower prices in some European and Asian cities arise from more competition in those cities or unacknowledged tax benefits and consumer subsidies that bring the price of, say, a local fiber network down.

The report raised a few more questions in my mind, however, that I’ll outline here.

As you no doubt have heard, Silk Road has been shut down by the FBI and its alleged operator, Ross Ulbricht, has been arrested. I've been getting a lot of questions about this and what it means for Bitcoin. Here are some initial thoughts.

The price of Bitcoin is dropping. What does that mean? It means that speculators are speculating. That said, here's how I'm going to read it: If the main value of Bitcoin is that it can be used to buy drugs on Silk Road (as some contend), then we should see the value drop to zero in short order. If Bitcoin has other value, we should see it weather this jolt. One year ago a Bitcoin traded for about $14. As I type this, it's hovering at about $127 (updated from $118 when I first checked).

How did they catch the guy? Good question. I don't know the answer, but that won't stop me from speculating. I will point out two things. First is this from the criminal complaint against Ross Ulbricht:

During the course of this investigation, the FBI has located a number of computer servers, both in the United States and in multiple foreign countries, associated with the operation of Silk Road. In particular, the FBI has located in a certain foreign country the server used to host Silk Road's website (the "Silk Road Web Server"). Pursuant to a mutual Legal Assistance Treaty request, an image of the Silk Road Web Server was made on or about July 23, 2013, and produced thereafter to the FBI.

OK. So how did the FBI "locate" the servers that hosted the Silk Road Tor hidden service? The FBI has recently admitted that they have exploited vulnerabilities in Tor to identify users. Could it be that they exploited some vulnerability in this case? I look forward to finding out.

That said, here is another possibility. Also according to the criminal complaint (emphasis added),

On or about July 10, 2013, [Customs and Border Patrol] intercepted a package from the mail inbound from Canada as part of a routine border search. The package was found to contain nine counterfeit identity documents. Each of the counterfeit identification documents was in a different name yet all contained a photograph of the same person.

That person was Ulbricht and the package was addressed to him. Maybe it was from this lead that the FBI was able to begin the process of identifying the servers, once they had a suspect. If so, and if this indeed was a "routine" search, then the authorities got completely lucky!

Finally, I'll point out that Bitcoin was in no way involved in the identification of the suspect. In fact, in the criminal complaint the FBI argues that because the blockchain (Bitcoin's public ledger) is pseudonymous, it is not useful in tracing transactions. I don't think that's quite right, but that's how the FBI sees it in this case. So, in this case at least, the privacy Bitcoin affords was not compromised in any way.
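Pseudonymity alone doesn't make flows untraceable: every transaction publicly records which address paid which, so coins can be followed from address to address even when no names are attached. Here's a minimal Python sketch of that kind of tracing over a toy ledger (all the addresses and amounts are invented for illustration):

```python
# Toy illustration of why a pseudonymous public ledger is still traceable.
# Each transaction records its source and destination addresses in the clear;
# only the mapping from addresses to people is hidden.
from collections import deque

ledger = [
    {"from": "addr_A", "to": "addr_B", "btc": 5.0},
    {"from": "addr_B", "to": "addr_C", "btc": 2.0},
    {"from": "addr_B", "to": "addr_D", "btc": 3.0},
    {"from": "addr_E", "to": "addr_F", "btc": 1.0},  # an unrelated flow
]

def downstream_addresses(start, ledger):
    """Follow coins forward from `start`, collecting every address the
    funds could have reached. The flow graph is public even if names aren't."""
    seen, queue = set(), deque([start])
    while queue:
        addr = queue.popleft()
        for tx in ledger:
            if tx["from"] == addr and tx["to"] not in seen:
                seen.add(tx["to"])
                queue.append(tx["to"])
    return seen

linked = downstream_addresses("addr_A", ledger)
assert linked == {"addr_B", "addr_C", "addr_D"}  # addr_F's flow stays unlinked
```

Real-world analysis is far more involved (transactions have multiple inputs and outputs, and users can generate fresh addresses at will), but this public flow graph is why the "not useful in tracing" characterization seems off to me.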

UPDATE: As I think about this some more, it's clear that the FBI was able to identify Ross Ulbricht because he posted his Gmail address to the Bitcoin Talk forum using the same username that first ever mentioned Silk Road. So, what are the chances that the CBP search that turned up the package of fake IDs bound for Ulbricht was routine? If it was routine, it was routine in the sense that packages to people on a watchlist might be routinely searched. I'm still not clear how the FBI got from identifying a possible suspect to locating the server for the Silk Road Tor hidden service.

How do you seize bitcoins? I'm surprised by how many times I've been asked this question. It's amazing what people seize upon in a story. < cough > I don't know how the authorities have carried out the seizure, but it's not too difficult to conceive how it could be done. Basically, they would have to get the private keys to the suspect's Bitcoin addresses. (Think of it essentially like getting the password to an account.) They could get them either with his cooperation or if he had stored them somewhere now accessible to the authorities. Once they have the private keys, they can transfer the bitcoins, and I imagine they would transfer them to a Bitcoin address that only they control.

UPDATE: So I got ahold of the seizure order and indeed I was correct that this is how the government will try to go about seizing the bitcoins. From the court order:

The United States is further authorized to seize any and all Bitcoins contained in wallet files residing on Silk Road servers, including those servers enumerated in the caption of this Complaint, pending the outcome of this civil proceeding, by transferring the full account balance in each Silk Road wallet to a public Bitcoin address controlled by the United States.

But to be clear, to seize bitcoins you do need to get the "password" that controls them. You can't just go to an intermediary and order that an account be frozen as you can do with traditional financial intermediaries like banks or PayPal.
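To make the "password" analogy concrete, here's a simplified Python sketch. Real Bitcoin uses ECDSA signatures over the secp256k1 curve; an HMAC stands in for the signature scheme here, and all the key material and the transfer message are made up. The point is that a valid transfer can be authorized only by whoever holds the private key; there is no intermediary who can freeze or move the funds on the government's behalf.

```python
# Simplified model of key-based control of coins. NOT real Bitcoin crypto:
# an HMAC substitutes for an ECDSA signature, but the control property is
# the same, since producing a valid authorization requires the private key.
import hmac
import hashlib

def sign(private_key: bytes, message: bytes) -> bytes:
    """Authorize a message; only the key holder can produce this value."""
    return hmac.new(private_key, message, hashlib.sha256).digest()

def verify(private_key: bytes, message: bytes, signature: bytes) -> bool:
    """Check an authorization against the key that controls the coins."""
    return hmac.compare_digest(sign(private_key, message), signature)

suspect_key = b"suspect-private-key"  # hypothetical seized key material
transfer = b"move full wallet balance to a government-controlled address"

# Without the key, no valid authorization can be forged...
forged = sign(b"not-the-key", transfer)
assert not verify(suspect_key, transfer, forged)

# ...but with the key (say, recovered from the seized server), it's valid.
authorized = sign(suspect_key, transfer)
assert verify(suspect_key, transfer, authorized)
```

Once signed with the real key, the transfer is as final as any other Bitcoin transaction, which is presumably why the court order directs the balances to an address the government controls rather than ordering an account frozen.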

I'll be tweeting and posting more as I learn more about what happened, but those are my initial thoughts. Shoot me any questions or thoughts you have. I'm at @jerrybrito on Twitter. And by the way, you can follow all the coverage of the Silk Road arrest and seizure on my site Mostly Bitcoin.

Sherwin Siy, Vice President of Legal Affairs at Public Knowledge, discusses emerging issues in digital copyright policy. He addresses the Department of Commerce’s recent green paper on digital copyright, including the need to reform copyright laws in light of new technologies. This podcast also covers the DMCA, online streaming, piracy, cell phone unlocking, fair use recognition, digital ownership, and what we’ve learned about copyright policy from the SOPA debate.


CBS and Time Warner Cable have been embroiled in a heated contractual battle over the past week that has resulted in viewers in some major markets losing access to CBS programming. When disputes like these go nuclear and signal blackouts occur, it is inevitable that some folks will call for policy interventions since nobody likes it when the content they love goes dark.

While some policy responses are warranted in this matter, policymakers should proceed with caution. Heated contractual negotiations are a normal part of any capitalist marketplace. We shouldn’t expect lawmakers to intervene to speed up negotiations or set content prices because that would disrupt the normal allocation of programming by placing a regulatory thumb too heavily on one side of the scale. This is why I am somewhat sympathetic to CBS in this fight. In an age when content creators struggle to protect their copyrighted content and get compensation for it, the last thing we need is government intervention that undermines the few distribution schemes that actually work well.

On the other hand, Time Warner Cable deserves sympathy here, too, since CBS currently enjoys some preexisting regulatory benefits. As I noted in this 2012 Forbes op-ed, “Toward a True Free Market in Television Programming,” many layers of red tape still encumber America’s video marketplace and prevent a truly free market in video programming from developing. The battle here revolves around the “retransmission consent” rules that were put in place as part of the Cable Act of 1992 and govern how video distributors carry signals from TV broadcasters, including CBS.

But those “retrans” rules are not the only part of the regulatory mess here.