Miscellaneous

Call it what you want: a bailout, a thumb on the scales, bidder restrictions–the FCC might conspicuously intervene in the 2015 incentive auctions at the behest of smaller carriers and public interest advocates.

Chairman Wheeler’s recent comments indicate the FCC may devise a way to prevent the two largest carriers–AT&T and Verizon–from purchasing “too much” of the television broadcasters’ spectrum at auction. AT&T likely sees the writing on the wall and argues that if there are limits, the restrictions should apply only to the auction itself, rather than more extreme restrictions that would penalize the two carriers for previously acquired spectrum. As The Switch’s Brian Fung put it,

the small carriers favor what are called “asymmetric” spectrum caps that affect various carriers differently, while opponents prefer “symmetric” caps that don’t account for existing market positions.

While I wish AT&T put up more of a fight against auction interventions, they (and staff at the FCC) are handicapped in pursuing an unrestricted auction. The blame lies mostly with Congress, which gave the FCC vague (thus ripe for abuse) and conflicting mandates spanning decades. The 1993 law authorizing auctions, for instance, requires the FCC to “avoid[] excessive concentration of licenses” and to “disseminat[e] licenses among a wide variety of applicants,” among other regulatory carve-outs for smaller competitors. These requirements, if implemented as rigorously as smaller carriers would like, directly undermine the purpose of the 2012 American Taxpayer Relief Act, which requires the upcoming spectrum auctions to raise $7 billion for a public safety broadband network and $20 billion for deficit reduction.

By asymmetrically penalizing AT&T and Verizon, the FCC increases the probability the auction fails to raise the tens of billions of dollars needed (see Fred Campbell’s recent paper). I haven’t heard a policymaker speak about the incentive auction without remarking how extraordinarily complex it is. That complexity–as was made clear in this week’s Senate hearing on the subject–means no one knows how much spectrum will be auctioned off or how much money will be raised. I was doubtful the FCC would secure the called-for 120 MHz for auction in the first place, but the Senate hearing convinced me that they might not get even 60 MHz. If the FCC meddles too much and the broadcasters aren’t assured they’ll get top dollar for their spectrum, the broadcasters might not show up to sell.

For many reasons, the FCC should ignore the pressure to restrict the large carriers in bidding. Smaller carriers argue the large carriers will outbid them only to preclude competition and hoard the spectrum. Every major carrier is spending billions to expand its footprint and capacity rapidly, so the hoarding argument is hard to accept (not to mention that carriers face FCC build-out requirements). The hoarding argument also confounds me because AT&T and Verizon are at the forefront arguing for more spectrum auctions, particularly auctions of spectrum held by federal agencies. Would they want the market flooded with new spectrum only so they could spend billions to hoard it?

Asymmetric auction restrictions also resemble a bailout for smaller carriers. T-Mobile and Sprint–who most actively lobby for auction restrictions–are not mom-and-pop establishments. Each is a sophisticated, powerful corporation with access to capital markets and backed by a larger international telecom–Germany’s Deutsche Telekom for T-Mobile and Japan’s SoftBank for Sprint. DT and SoftBank have both pledged to spend billions in the next few years to improve their American carriers’ competitive positions. Such carriers do not need an FCC handout.

The bailout resemblance is more apparent when you realize Sprint has been hamstrung for nearly a decade by damaging business decisions. Three come immediately to mind: 1) the dreadful merger with Nextel in 2005; 2) the ill-fated bet in 2008 to forgo LTE rollout in favor of WiMax, a competing 4G standard; and 3) the loss of over one million customers when it discontinued its push-to-talk iDEN service for network upgrades. The losses from the Nextel merger alone approach $30 billion.

To be clear, I don’t second-guess Sprint’s decisions. They did what innovative firms are supposed to do in attempting big, risky investments. However, it should not be the job of the FCC to favor some firms through spectrum auctions because some carriers’ business decisions did not pan out. That is not a competitive wireless auction–that is an FCC-orchestrated bailout. Granted, the FCC has been handed conflicting mandates. The Commission has ample discretion, however, to conduct a competitive auction that both complies with the law and improves chances of reaching the ambitious revenue goals. Intense meddling with auction results could prove disastrous.

There is bipartisan agreement that the 1996 Telecom Act was antiquated only shortly after President Clinton’s signature had dried on the legislation. There is also consensus that spectrum policy, still largely grounded in the 1934 communications statute, absolutely distorts today’s wireless markets. And there is frequent criticism from thought leaders, right and left, that the FCC has been, for decades, too accommodating to the firms it regulates and too beholden to the status quo (economist Thomas Hazlett quips the agency’s initials stand for “Forever Captured by Corporations”).

For these reasons, members of Congress every few years announce their intention to reform the 1934 and 1996 communications laws and modernize the FCC. Yesterday, some powerful House members unexpectedly reignited hopes that Congress would overhaul our telecom, broadband, and video laws. In a Google Hangout (!), Reps. Fred Upton and Greg Walden said they wanted to take on the ambitious task of passing a new law in 2015.

Much depends on next year’s elections and the composition of Congress, but hopefully the announcement spurs a major re-write that eliminates regulatory distortions in communications, much as airlines and transportation were deregulated in the 1970s–an effort led by reformist Democrats.

About ten years ago, more than fifty scholars and technologists crafted the reports that constitute the Digital Age Communications Act (or DACA), a largely deregulatory framework (a majority of the group had served in Democratic administrations, interestingly enough). In 2005, then-Sen. Jim DeMint proposed a bill similar to the working group’s proposals. The working group’s recommendations have aged very well over eight years–which you can’t say about the 1996 Act–and represent a great starting point for future legislation.

As Adam has said, the DACA reports have five primary reform objectives:

- Replacing the amorphous “public interest” standard with a consumer welfare standard, which is better established in the field of antitrust law

- Eliminating regulatory silos and leveling the playing field through deregulation

- Comprehensively reforming spectrum policy, not just through more auctioning but through clear property rights

- Reforming universal service by either voucherizing it or devolving it to the states and letting them run their own telecom welfare programs; and

- Significantly reforming and downsizing the scope of the FCC’s power over the modern information economy

DACA redefines the FCC as a specialized competition agency for the communications sector. The FCC largely sees itself as a competition agency today, but the current statutes don’t reflect that gradual change in purpose. The FCC is slow and arbitrary, Balkanizes industries artificially, and attempts to regulate in areas it isn’t equipped to regulate–the agency has a notoriously bad record in federal courts. These characteristics create a poor environment for substantial investments in technology and communications infrastructure. The DACA proposals aren’t perfect, but they offer a resilient framework that minimizes the influence of special interests in communications and encourages investments that improve consumers’ lives.

“Net neutrality is a dead man walking,” Marvin Ammori stated in Wired last week, citing the probable demise of the FCC’s Open Internet rules in court. I’d agree for a different reason. Net neutrality has been dead ever since the FCC released its net neutrality order in December 2010. (This is not to say the damaging rules should be upheld by the DC Circuit. For many reasons, the Order should be struck down.) I agree with Ammori because we already have the Internet “fast lane” many net neutrality proponents wanted to prevent. Since that goal is precluded, all the rules do is hang Damocles’ Sword over ISPs regarding traffic management.

The 2010 rules managed to make both sides unhappy. The ISPs face severe penalties if three FCC commissioners believe ISP network management practices “unreasonably discriminate” against certain traffic. Public interest groups, on the other hand, were dissatisfied because they wanted ISPs reclassified as common carriers to prevent deep-pocketed content creators from allying with ISPs to create an Internet “fast lane” for some companies, relegating most other websites to the so-called “winding dirt road” of the public Internet.

Proponents emphasize different goals of net neutrality (to the point–many argue–that it’s hard to discern what the term means). But if preventing the creation of a fast lane is the main goal of net neutrality, it’s dead already. Consider two popularly cited net neutrality “violations” that do not violate the Open Internet Order: Netflix’s Open Connect program and Comcast’s practice of not counting its Xfinity video-on-demand (VOD) service against customers’ data limits.

Both cases involve the creation of a fast lane for certain content and activists rail against them. Both cases also involve network practices expressly exempted from net neutrality regulations. The FCC exempted these sorts of services because they are important, benefit the public, and should be encouraged. With Open Connect, Netflix scatters its many servers across the country closer to households, which allows its content to stream at a higher quality than most other video sites. Comcast gives its Xfinity VOD fast-lane treatment as well, which is completely legal since VOD from a cable company is a “specialized service” exempt from the rules.

“Specialized service” needs some explanation since it’s a novel concept from the FCC order. The net neutrality rules distinguish between “broadband Internet access service” (BIAS)–to which the regulations apply–and specialized (or managed) services–to which they don’t apply. The exemption of specialized services opens up a dangerous loophole in the view of proponents.

BIAS is what most consider “the Internet.” It’s the everyday websites we access on our computers and smartphones. What are specialized services? In the sleepy month of August, the FCC’s Open Internet Advisory Committee released its report on the criteria a specialized service must meet to be exempt from net neutrality scrutiny (the criteria are influential but advisory, not binding):

1. The service doesn’t reach large parts of the Internet, and
2. The service is an “application level” service.

The Advisory Committee also thought that “capacity isolation” is a good indicator that a service should be exempt. With capacity isolation, the ISP has one broadband connection going to the home but separates the service’s data stream from the conventional Internet stream consumers use to visit Facebook, YouTube, and the like. This is how Comcast’s streaming of Xfinity to Xboxes is exempt–it travels over a proprietary network going into the home. As long as carriers don’t divert BIAS capacity for the application, the FCC will likely turn a blind eye.
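To make the rubric concrete, here is a toy sketch of how those criteria might combine. This is my own illustration–the function name and the boolean inputs are invented for exposition, not anything the FCC or the Advisory Committee publishes:

```python
def likely_exempt(reaches_large_parts_of_internet: bool,
                  is_application_level: bool,
                  uses_isolated_capacity: bool) -> bool:
    """Toy reading of the Advisory Committee's criteria (illustrative only).

    A service that reaches large parts of the Internet is BIAS, so the
    rules apply. Otherwise, an application-level service qualifies for
    the exemption, and capacity isolation strengthens the case.
    """
    if reaches_large_parts_of_internet:
        return False  # it's broadband Internet access service (BIAS)
    return is_application_level or uses_isolated_capacity

# Xfinity-to-Xbox: closed network, application-level, isolated stream
print(likely_exempt(False, True, True))   # True (exempt)
# Ordinary broadband access reaches the whole Internet
print(likely_exempt(True, False, False))  # False (covered by the rules)
```

The point of the sketch is just that the test is conjunctive at the top: nothing that functions as general Internet access can escape the rules, however the remaining indicators come out.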

What are some examples? Specialized service is marked by higher-quality streams that typically don’t suffer from jitter and latency. If you have “digital voice” from Comcast, for example, you are receiving a specialized service–proprietary VoIP. Specialized services can also include data streams like VOD, e-reader downloads, heart monitor data, and gaming services. The FCC exempted these because some are important enough that they shouldn’t compete with BIAS traffic. It would be obviously damaging to have digital phone service or health monitors disrupted because others in the household are checking up on their fantasy football teams. The FCC also wanted to spur investment in specialized services, and video companies like Netflix are considering pairing up with ISPs to deliver a better experience to customers.

That is to say, the net neutrality effort has failed even worse than most realize. The FCC essentially prohibited innovative business models in BIAS, freezing that service into common-carrier-like status. Further, we have an Internet fast lane (which I consider a significant public benefit, though net neutrality proponents often do not). As business models evolve and the costs of server networks fall, our two-tier system will become more apparent.

I think I owe Tom Brokaw an apology. When I first started reading his most recent Wall Street Journal column, “Imagine the Tweets During the Cuban Missile Crisis,” I assumed that I was in for one of those hyper-nostalgic essays about how the ‘good ol’ days’ of mass media had passed us by and why the new media era is an unmitigated disaster. Instead, I was pleased to read his very balanced and sensible view of the old versus new media environments. Reflecting on the evolution of the media marketplace over the 50 years since JFK’s assassination, Brokaw notes that:

The media climate has changed dramatically. The New Frontier, as Kennedy liked to call his administration, received a great deal of attention, but 50 years ago the major national information sources consisted of a handful of big-city daily newspapers, a few weekly news periodicals and two dominant TV network evening newscasts. Now the political news comes at us 24/7 on cable, through the air, the digital universe, on radio and print. And it comes to us more and more as opinion rather than a recitation of the facts as best they can be determined. News is a hit-and-run game, for the most part, with too little accountability for error.

This leads Brokaw to wonder if the amazing media metamorphosis has been, on net, positive or negative. “The virtual town square has been wired and expanded,” he notes, “but the question remains whether more voices make for a healthier political climate. With a keystroke we can easily move from an online credible source of information to a website larded with opinion or deliberately malicious erroneous claims. Have we simply enlarged the megaphone, cranked up the decibel level, and rallied the like-minded without regard to facts or consequences?”

Jon Brodkin at Ars Technica and Brian Fung at The Switch have posts featuring a New America Foundation study, The Cost of Connectivity 2013, comparing international prices and speeds of broadband. As I told Fung when he asked for my assessment of the study, I was left wondering whether lower prices in some European and Asian cities arise from more competition in those cities or unacknowledged tax benefits and consumer subsidies that bring the price of, say, a local fiber network down.

The report raised a few more questions in my mind, however, that I’ll outline here.

Timothy B. Lee, founder of The Washington Post’s blog The Switch, discusses his approach to reporting at the intersection of technology and policy. He covers how to make tech concepts more accessible; the difference between blogs and the news; the importance of investigative journalism in the tech space; whether paywalls are here to stay; Jeff Bezos’ recent purchase of The Washington Post; and the future of print news.

There are few things more likely to get constituents to call their representative than TV programming blackouts, and the increase in broadcasting disruptions arising from licensing disputes in recent years means Congress may be forced to once again fix television and copyright laws. As Jerry Brito explains at Reason, the current standoff between CBS and Time Warner Cable is the result of bad regulations, which contribute to more frequent broadcaster blackouts. While each type of TV distributor (cable, satellite, broadcasters, telcos) is both advantaged and disadvantaged by regulation, broadcasters are particularly favored. As the US Copyright Office has said, the rule at issue in CBS-TWC is “part of a thicket of communications law requirements aimed at protecting and supporting the broadcast industry.”

But as we approach a damaging tipping point of rising programming costs and blackouts, Congress’ potential rescuer–Aereo–appears on the horizon, possibly buying more time before a major regulatory rewrite. Aereo, for the uninitiated, is a small online company that sets up tiny antennas in certain cities to capture broadcast television station signals–like CBS, NBC, ABC, Fox, the CW, and Univision–and streams those signals online to paying customers, who can watch live or record the local signals captured by their own “rented” Aereo antenna. Broadcasters hate this because the service deprives them of lucrative retransmission fees, and they have unsuccessfully sued to get Aereo to cease operations.

Sherwin Siy, Vice President of Legal Affairs at Public Knowledge, discusses emerging issues in digital copyright policy. He addresses the Department of Commerce’s recent green paper on digital copyright, including the need to reform copyright laws in light of new technologies. This podcast also covers the DMCA, online streaming, piracy, cell phone unlocking, fair use recognition, digital ownership, and what we’ve learned about copyright policy from the SOPA debate.

This is the second of a series of three blog posts about broadband in America in response to Susan Crawford’s book Captive Audience and her recent blog post responding to positive assessments of America’s broadband marketplace in the New York Times. Read the first post here. This post addresses Crawford’s claim that every American needs fiber, regardless of the cost, and that government should manage the rollout.

It is important to point out that fiber is extant in almost all broadband technologies and has been for years. Not only are backbones built with fiber, but there is fiber to the mobile base station and fiber in cable and DSL networks. In fact, American carriers are already some of the world’s biggest buyers of fiber. They made their largest purchase to date in 2011: some 18 million miles of fiber optic cable. In the last few years American firms bought more fiber optic cable than all of Europe combined.[1]

The debate is about a broadband technology called fiber to the home (FTTH). The question is whether and how to pay for extending fiber from the existing infrastructure–from the curb into the house itself, as it were. Typically it’s this last part of the journey that is expensive, given the need to secure rights of way, eminent domain proceedings, labor costs, trenching, indoor wiring, and repairs. Subscribers should have a say in whether the cost and disruption are warranted by the price and performance. There is also a question of whether the technology is so essential and proven that the government should pay for it outright, or mandate that carriers provide it.

Fiber in the corporate setting is a different discussion. Many companies use private fiber networks. The fact that a company or large office building offers a concentration of many subscribers paying higher fees has helped fiber become the enterprise broadband choice for many companies. Households don’t have the same economics.

There is no doubt that FTTH is a cool technology, but love of a particular technology should not blind one to the economics. After some brief background, this blog post will investigate fiber from three perspectives: (1) the bandwidth requirements of web applications, (2) the cost of deployment, and (3) substitutes and alternatives. Finally, it discusses the notion of fiber as future-proof.

Broadband Subscriptions in the OECD

By way of background, the OECD Broadband Portal[2] report from December 2012 notes that the US has 90 million fixed (wired) connections, more than a quarter of the total (327 million) for the 34 nations in the study. On the mobile side, Americans have three times as many mobile broadband subscriptions as fixed. The 280 million mobile broadband subscriptions held by Americans account for roughly 36% of the total 780 million mobile subscriptions in the OECD. These are the smartphones and devices Americans use to connect to the internet.
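As a quick sanity check on those figures (simple arithmetic on the numbers above, not new data):

```python
# OECD Broadband Portal figures, December 2012
fixed_us, fixed_oecd = 90e6, 327e6
mobile_us, mobile_oecd = 280e6, 780e6

print(fixed_us / fixed_oecd)    # ~0.275: a bit over a quarter of fixed lines
print(mobile_us / mobile_oecd)  # ~0.359: roughly 36% of mobile subscriptions
print(mobile_us / fixed_us)     # ~3.1: about three mobile per fixed in the US
```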

This week it is our pleasure to welcome Roslyn Layton to the TLF, who will be doing some guest blogging on broadband policy issues. Roslyn Layton is a PhD Fellow who studies internet economics at the Center for Communication, Media, and Information Technologies at Aalborg University in Copenhagen, Denmark. Her program is a partnership between the Danish Department of Research & Innovation, Aalborg University, and Strand Consult, a Danish company. Prior to her current academic position, Roslyn worked in the IT industry in the U.S., India, and Europe. Her personal page is: www.RoslynLayton.com

She’ll be rolling out three essays over the course of the week based on her extensive research in this field, including her recent series on “10 Myths and Realities of Broadband Internet in the USA.”